Jan 25 05:38:28 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 25 05:38:28 crc restorecon[4565]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 25 05:38:28 crc restorecon[4565]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 
05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 
crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc 
restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 25 05:38:28 crc restorecon[4565]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 25 05:38:29 crc kubenswrapper[4728]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 25 05:38:29 crc kubenswrapper[4728]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 25 05:38:29 crc kubenswrapper[4728]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 25 05:38:29 crc kubenswrapper[4728]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 25 05:38:29 crc kubenswrapper[4728]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 25 05:38:29 crc kubenswrapper[4728]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.200037 4728 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202732 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202748 4728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202753 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202757 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202761 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202765 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202769 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202773 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202776 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 25 05:38:29 
crc kubenswrapper[4728]: W0125 05:38:29.202780 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202783 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202786 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202789 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202793 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202798 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202801 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202804 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202807 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202811 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202817 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202820 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202823 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202827 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202830 
4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202833 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202837 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202840 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202843 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202846 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202849 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202852 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202855 4728 feature_gate.go:330] unrecognized feature gate: Example Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202858 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202862 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202866 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202869 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202872 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202877 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202882 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202886 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202890 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202894 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202897 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202900 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202903 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202906 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202909 4728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202912 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202915 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202918 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202921 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202925 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 25 
05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202928 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202931 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202935 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202938 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202942 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202945 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202948 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202959 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202962 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202965 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202968 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202971 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202974 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202977 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 
05:38:29.202982 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202988 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202992 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202995 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.202999 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203482 4728 flags.go:64] FLAG: --address="0.0.0.0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203494 4728 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203500 4728 flags.go:64] FLAG: --anonymous-auth="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203506 4728 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203511 4728 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203514 4728 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203519 4728 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203524 4728 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203528 4728 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203531 4728 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203535 4728 
flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203538 4728 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203542 4728 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203546 4728 flags.go:64] FLAG: --cgroup-root="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203549 4728 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203552 4728 flags.go:64] FLAG: --client-ca-file="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203556 4728 flags.go:64] FLAG: --cloud-config="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203561 4728 flags.go:64] FLAG: --cloud-provider="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203564 4728 flags.go:64] FLAG: --cluster-dns="[]" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203572 4728 flags.go:64] FLAG: --cluster-domain="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203576 4728 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203580 4728 flags.go:64] FLAG: --config-dir="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203583 4728 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203588 4728 flags.go:64] FLAG: --container-log-max-files="5" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203593 4728 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203597 4728 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203602 4728 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 
05:38:29.203606 4728 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203610 4728 flags.go:64] FLAG: --contention-profiling="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203614 4728 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203618 4728 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203621 4728 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203626 4728 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203631 4728 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203635 4728 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203638 4728 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203642 4728 flags.go:64] FLAG: --enable-load-reader="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203646 4728 flags.go:64] FLAG: --enable-server="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203649 4728 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203655 4728 flags.go:64] FLAG: --event-burst="100" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203659 4728 flags.go:64] FLAG: --event-qps="50" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203663 4728 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203666 4728 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203670 4728 flags.go:64] FLAG: --eviction-hard="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 
05:38:29.203675 4728 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203679 4728 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203683 4728 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203687 4728 flags.go:64] FLAG: --eviction-soft="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203691 4728 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203695 4728 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203699 4728 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203702 4728 flags.go:64] FLAG: --experimental-mounter-path="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203706 4728 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203709 4728 flags.go:64] FLAG: --fail-swap-on="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203713 4728 flags.go:64] FLAG: --feature-gates="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203717 4728 flags.go:64] FLAG: --file-check-frequency="20s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203721 4728 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203725 4728 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203729 4728 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203732 4728 flags.go:64] FLAG: --healthz-port="10248" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203735 4728 flags.go:64] FLAG: --help="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 
05:38:29.203739 4728 flags.go:64] FLAG: --hostname-override="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203742 4728 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203745 4728 flags.go:64] FLAG: --http-check-frequency="20s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203749 4728 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203753 4728 flags.go:64] FLAG: --image-credential-provider-config="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203756 4728 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203760 4728 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203764 4728 flags.go:64] FLAG: --image-service-endpoint="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203767 4728 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203771 4728 flags.go:64] FLAG: --kube-api-burst="100" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203774 4728 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203778 4728 flags.go:64] FLAG: --kube-api-qps="50" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203781 4728 flags.go:64] FLAG: --kube-reserved="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203785 4728 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203788 4728 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203792 4728 flags.go:64] FLAG: --kubelet-cgroups="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203795 4728 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 25 05:38:29 crc 
kubenswrapper[4728]: I0125 05:38:29.203799 4728 flags.go:64] FLAG: --lock-file="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203802 4728 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203805 4728 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203809 4728 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203814 4728 flags.go:64] FLAG: --log-json-split-stream="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203818 4728 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203821 4728 flags.go:64] FLAG: --log-text-split-stream="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203825 4728 flags.go:64] FLAG: --logging-format="text" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203828 4728 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203832 4728 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203836 4728 flags.go:64] FLAG: --manifest-url="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203839 4728 flags.go:64] FLAG: --manifest-url-header="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203844 4728 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203848 4728 flags.go:64] FLAG: --max-open-files="1000000" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203854 4728 flags.go:64] FLAG: --max-pods="110" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203857 4728 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203861 4728 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 25 05:38:29 crc 
kubenswrapper[4728]: I0125 05:38:29.203864 4728 flags.go:64] FLAG: --memory-manager-policy="None" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203867 4728 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203871 4728 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203874 4728 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203878 4728 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203887 4728 flags.go:64] FLAG: --node-status-max-images="50" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203890 4728 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203894 4728 flags.go:64] FLAG: --oom-score-adj="-999" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203897 4728 flags.go:64] FLAG: --pod-cidr="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203901 4728 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203909 4728 flags.go:64] FLAG: --pod-manifest-path="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203912 4728 flags.go:64] FLAG: --pod-max-pids="-1" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203916 4728 flags.go:64] FLAG: --pods-per-core="0" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203920 4728 flags.go:64] FLAG: --port="10250" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203924 4728 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203928 4728 flags.go:64] FLAG: 
--provider-id="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203932 4728 flags.go:64] FLAG: --qos-reserved="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203936 4728 flags.go:64] FLAG: --read-only-port="10255" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203939 4728 flags.go:64] FLAG: --register-node="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203943 4728 flags.go:64] FLAG: --register-schedulable="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203947 4728 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203963 4728 flags.go:64] FLAG: --registry-burst="10" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203967 4728 flags.go:64] FLAG: --registry-qps="5" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203971 4728 flags.go:64] FLAG: --reserved-cpus="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203974 4728 flags.go:64] FLAG: --reserved-memory="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203980 4728 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203984 4728 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203987 4728 flags.go:64] FLAG: --rotate-certificates="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203991 4728 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203995 4728 flags.go:64] FLAG: --runonce="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.203998 4728 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204002 4728 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204006 4728 flags.go:64] FLAG: --seccomp-default="false" Jan 
25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204009 4728 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204013 4728 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204017 4728 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204021 4728 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204025 4728 flags.go:64] FLAG: --storage-driver-password="root" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204029 4728 flags.go:64] FLAG: --storage-driver-secure="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204032 4728 flags.go:64] FLAG: --storage-driver-table="stats" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204036 4728 flags.go:64] FLAG: --storage-driver-user="root" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204040 4728 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204043 4728 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204047 4728 flags.go:64] FLAG: --system-cgroups="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204051 4728 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204058 4728 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204061 4728 flags.go:64] FLAG: --tls-cert-file="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204065 4728 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204072 4728 flags.go:64] FLAG: --tls-min-version="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204075 4728 flags.go:64] FLAG: 
--tls-private-key-file="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204079 4728 flags.go:64] FLAG: --topology-manager-policy="none" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204082 4728 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204086 4728 flags.go:64] FLAG: --topology-manager-scope="container" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204089 4728 flags.go:64] FLAG: --v="2" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204095 4728 flags.go:64] FLAG: --version="false" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204099 4728 flags.go:64] FLAG: --vmodule="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204107 4728 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204111 4728 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204212 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204216 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204221 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204224 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204228 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204231 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204234 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204237 4728 feature_gate.go:330] 
unrecognized feature gate: UpgradeStatus Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204241 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204243 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204246 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204249 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204253 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204255 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204258 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204261 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204264 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204267 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204270 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204273 4728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204277 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204279 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 25 05:38:29 crc kubenswrapper[4728]: 
W0125 05:38:29.204283 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204287 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204290 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204293 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204296 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204299 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204302 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204306 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204309 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204313 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204329 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204334 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204337 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204341 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204344 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204347 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204350 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204360 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204364 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204367 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204370 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204373 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204376 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204379 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204382 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204386 4728 feature_gate.go:353] Setting GA 
feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204389 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204392 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204395 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204399 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204403 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204406 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204409 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204412 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204415 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204418 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204422 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204425 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204429 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204433 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204436 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204440 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204444 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204448 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204452 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204455 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204459 4728 feature_gate.go:330] unrecognized feature gate: Example Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204463 4728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.204467 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.204479 4728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 25 05:38:29 crc 
kubenswrapper[4728]: I0125 05:38:29.210907 4728 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.210934 4728 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211002 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211010 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211015 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211020 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211024 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211027 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211031 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211034 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211039 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211043 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211047 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211051 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211055 4728 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211058 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211062 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211065 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211069 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211072 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211075 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211079 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211083 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211087 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211090 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211093 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211097 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211101 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211104 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 25 05:38:29 crc kubenswrapper[4728]: 
W0125 05:38:29.211107 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211111 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211115 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211119 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211123 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211127 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211130 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211134 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211138 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211141 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211145 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211148 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211152 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211156 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211162 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211166 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211171 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211175 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211179 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211183 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211186 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211200 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211204 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211207 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211211 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211215 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211219 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211223 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211226 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211229 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211234 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211237 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211241 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211244 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211248 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211251 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211254 4728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211258 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211261 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211265 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211269 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211273 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211276 4728 feature_gate.go:330] unrecognized feature gate: Example Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211280 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.211286 4728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211388 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211395 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211398 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211402 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211405 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211409 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211412 4728 feature_gate.go:330] unrecognized feature 
gate: EtcdBackendQuota Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211415 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211418 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211421 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211424 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211428 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211431 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211434 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211437 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211440 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211443 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211448 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211452 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211455 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211458 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 
05:38:29.211461 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211464 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211467 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211471 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211474 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211477 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211481 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211485 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211489 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211492 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211495 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211498 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211501 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211508 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211511 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211514 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211517 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211520 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211523 4728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211526 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211529 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211532 4728 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211535 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211538 4728 feature_gate.go:330] unrecognized feature gate: Example Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211541 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211544 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211546 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211549 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211553 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211557 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211560 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211563 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211566 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211569 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211572 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211576 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211578 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211581 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211584 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211587 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211590 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211593 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211596 4728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211599 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211603 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211606 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211609 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211614 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211619 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.211623 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.211629 4728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.213235 4728 server.go:940] "Client rotation is on, will bootstrap in background" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.215722 4728 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.215781 4728 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.216487 4728 server.go:997] "Starting client certificate rotation" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.216511 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.217141 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-27 08:42:48.852353233 +0000 UTC Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.217200 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.230561 4728 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.232358 4728 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.232412 4728 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.240451 4728 log.go:25] "Validated CRI v1 runtime API" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.258131 4728 log.go:25] "Validated CRI v1 image API" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.259649 4728 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.263823 4728 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-25-05-35-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.263854 4728 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.277028 4728 manager.go:217] Machine: {Timestamp:2026-01-25 05:38:29.275539873 +0000 UTC m=+0.311417873 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445404 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3ea98fc6-5f41-42ee-97d9-1061312a21b0 BootID:f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} 
{Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fb:c1:80 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:fb:c1:80 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:5f:06:d5 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:f0:a0:9b Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:76:f4:53 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:66:6f:f7 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:ba:11:60:95:27:48 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:83:12:ea:a2:c7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified 
Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 
Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.277198 4728 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.277294 4728 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.278419 4728 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.278571 4728 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.278605 4728 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.278786 4728 topology_manager.go:138] "Creating topology manager with none policy" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.278796 4728 container_manager_linux.go:303] "Creating device plugin manager" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.279071 4728 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.279099 4728 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.279180 4728 state_mem.go:36] "Initialized new in-memory state store" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.279257 4728 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.281000 4728 kubelet.go:418] "Attempting to sync node with API server" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.281019 4728 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.281039 4728 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.281050 4728 kubelet.go:324] "Adding apiserver pod source" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.281062 4728 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.284487 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.284511 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.284598 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: 
failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.284626 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.285405 4728 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.286083 4728 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.286931 4728 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288015 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288037 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288044 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288051 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288064 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288070 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288077 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288088 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288095 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288102 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288112 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.288119 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.289181 4728 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.289567 4728 server.go:1280] "Started kubelet" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.290287 4728 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.290313 4728 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 25 05:38:29 crc systemd[1]: Started Kubernetes Kubelet. Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.290836 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.290873 4728 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292132 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292156 4728 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292428 4728 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292497 4728 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292578 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 15:30:57.117935133 +0000 UTC Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292636 4728 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.292809 4728 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.292939 4728 server.go:460] "Adding debug handlers to kubelet server" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.292978 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.293022 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.293030 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="200ms" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.292767 4728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.50:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188de2b82d5f14a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-25 05:38:29.289538727 +0000 UTC m=+0.325416697,LastTimestamp:2026-01-25 05:38:29.289538727 +0000 UTC 
m=+0.325416697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.294204 4728 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.294221 4728 factory.go:55] Registering systemd factory Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.294231 4728 factory.go:221] Registration of the systemd container factory successfully Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.297996 4728 factory.go:153] Registering CRI-O factory Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.298039 4728 factory.go:221] Registration of the crio container factory successfully Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.298077 4728 factory.go:103] Registering Raw factory Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.298097 4728 manager.go:1196] Started watching for new ooms in manager Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.299160 4728 manager.go:319] Starting recovery of all containers Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304371 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304405 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 25 05:38:29 crc 
kubenswrapper[4728]: I0125 05:38:29.304416 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304426 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304434 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304443 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304451 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304460 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304470 4728 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304480 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304489 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304499 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304510 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304521 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304531 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304541 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304552 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304561 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304569 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304580 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304589 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.304598 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305486 4728 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305512 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305523 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305534 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305543 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305559 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305570 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305594 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305643 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305654 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305664 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305673 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305682 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305691 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305700 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305712 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305724 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305743 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305752 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305762 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305770 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305780 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305788 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305796 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305807 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305817 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305826 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305834 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305844 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305856 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305865 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305878 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305887 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305898 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305908 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305919 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305931 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305940 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305956 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305965 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305977 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305987 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.305996 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306005 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306015 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306023 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306032 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306041 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306051 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306059 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306095 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306105 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306114 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306123 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306132 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306142 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306151 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306161 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306170 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306179 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306188 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306198 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306208 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306217 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306227 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306237 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306247 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306255 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306265 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306274 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306285 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306296 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306306 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306328 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306339 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306349 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306359 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306368 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306377 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306388 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306397 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306404 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306413 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306427 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306436 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306447 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306456 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306466 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306477 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306486 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306498 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306508 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306516 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306526 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306535 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306544 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306552 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306562 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306570 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306579 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306587 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306596 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306607 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306616 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306626 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306635 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306644 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306652 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306661 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306672 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306680 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306690 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306699 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306707 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306715 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306724 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306733 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306742 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306752 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306762 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306771 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306779 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306788 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306796 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306804 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306813 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306823 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306832 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306842 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306851 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306859 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306869 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306878 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306887 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306897 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306911 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306921 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306933 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306942 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306956 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306972 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306983 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.306992 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307001 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307010 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307019 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307028 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307036 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307044 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307052 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307061 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307070 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307079 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307087 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307097 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307108 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307115 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307126 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307135 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307144 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307152 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307161 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307169 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307177 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292"
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307186 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307195 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307204 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307213 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307223 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307232 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307242 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307252 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307261 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307270 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307279 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307289 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 25 
05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307298 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307307 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307330 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307338 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307347 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307356 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307365 4728 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307374 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307384 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307391 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307399 4728 reconstruct.go:97] "Volume reconstruction finished" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.307406 4728 reconciler.go:26] "Reconciler: start to sync state" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.316864 4728 manager.go:324] Recovery completed Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.325726 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.326233 4728 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.326978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327015 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327586 4728 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327599 4728 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327609 4728 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327614 4728 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327625 4728 state_mem.go:36] "Initialized new in-memory state store" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.327636 4728 kubelet.go:2335] "Starting kubelet main sync loop" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.327675 4728 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.329268 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.329314 4728 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.329421 4728 policy_none.go:49] "None policy: Start" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.332116 4728 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.332143 4728 state_mem.go:35] "Initializing new in-memory state store" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.374616 4728 manager.go:334] "Starting Device Plugin manager" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.374722 4728 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.374778 4728 server.go:79] "Starting device plugin registration server" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.375114 4728 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.375188 4728 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.375538 4728 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.375622 4728 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.375637 4728 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.382015 4728 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to 
get node info: node \"crc\" not found" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.428618 4728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.428750 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.429822 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.429855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.429870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.430119 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.430393 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.430475 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.430904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.430924 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.430934 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431050 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431248 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431272 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.431997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432114 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432235 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc 
kubenswrapper[4728]: I0125 05:38:29.432400 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432439 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.432987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433176 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433379 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433454 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433819 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.433832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434099 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434133 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434414 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434466 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434791 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434819 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.434830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.475510 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.476784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.476816 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.476827 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: 
I0125 05:38:29.476878 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.477528 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.50:6443: connect: connection refused" node="crc" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.493973 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="400ms" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.509782 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.509821 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.509846 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.509867 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.509895 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.509971 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510005 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510031 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510058 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510079 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510127 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510169 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510194 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510213 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.510231 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611477 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611512 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611536 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611556 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc 
kubenswrapper[4728]: I0125 05:38:29.611575 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611596 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611611 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611631 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611650 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611667 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611674 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611695 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611736 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611742 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611663 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 
crc kubenswrapper[4728]: I0125 05:38:29.611760 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611779 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611816 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611785 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611761 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611878 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611887 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611926 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611941 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.611986 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.612005 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.612006 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.612034 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.612096 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.678570 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.679613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.679657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.679668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.679692 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.680026 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.50:6443: connect: connection refused" node="crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.780535 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.785480 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.805053 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-68f3e6433e6dbe10dcc229d5c31d7ce65423fce340bc30ba4a49223cee079b19 WatchSource:0}: Error finding container 68f3e6433e6dbe10dcc229d5c31d7ce65423fce340bc30ba4a49223cee079b19: Status 404 returned error can't find the container with id 68f3e6433e6dbe10dcc229d5c31d7ce65423fce340bc30ba4a49223cee079b19 Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.808311 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-aa4a914d3ee9f496f8892f2aae09e4bdcde9b56204f445949fcb908affc80d16 WatchSource:0}: Error finding container aa4a914d3ee9f496f8892f2aae09e4bdcde9b56204f445949fcb908affc80d16: Status 404 returned error can't find the container with id 
aa4a914d3ee9f496f8892f2aae09e4bdcde9b56204f445949fcb908affc80d16 Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.808558 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.819765 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3508543608c6eccc0d10792fdd63be567f6875e1e5a7ebbdcb1793e7a938e3fd WatchSource:0}: Error finding container 3508543608c6eccc0d10792fdd63be567f6875e1e5a7ebbdcb1793e7a938e3fd: Status 404 returned error can't find the container with id 3508543608c6eccc0d10792fdd63be567f6875e1e5a7ebbdcb1793e7a938e3fd Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.822077 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: I0125 05:38:29.827316 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.835684 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b0784f94457b208c970dce8fd6ee54c99e81f677a323c4b47609ef5d10f66802 WatchSource:0}: Error finding container b0784f94457b208c970dce8fd6ee54c99e81f677a323c4b47609ef5d10f66802: Status 404 returned error can't find the container with id b0784f94457b208c970dce8fd6ee54c99e81f677a323c4b47609ef5d10f66802 Jan 25 05:38:29 crc kubenswrapper[4728]: W0125 05:38:29.836942 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-22c405550c2be2956f9d36ebc41db0fd9e82f4ade3f35cdaa81bfa75b17ef50d WatchSource:0}: Error finding container 22c405550c2be2956f9d36ebc41db0fd9e82f4ade3f35cdaa81bfa75b17ef50d: Status 404 returned error can't find the container with id 22c405550c2be2956f9d36ebc41db0fd9e82f4ade3f35cdaa81bfa75b17ef50d Jan 25 05:38:29 crc kubenswrapper[4728]: E0125 05:38:29.895544 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="800ms" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.080878 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.082229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.082261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 
05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.082271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.082294 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.082738 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.50:6443: connect: connection refused" node="crc" Jan 25 05:38:30 crc kubenswrapper[4728]: W0125 05:38:30.136924 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.137005 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:30 crc kubenswrapper[4728]: W0125 05:38:30.283288 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.283647 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: 
connection refused" logger="UnhandledError" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.291860 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.292850 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 06:42:49.978425445 +0000 UTC Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.332462 4728 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad" exitCode=0 Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.332544 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.332657 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3508543608c6eccc0d10792fdd63be567f6875e1e5a7ebbdcb1793e7a938e3fd"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.332746 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334037 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d" exitCode=0 Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334061 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334084 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa4a914d3ee9f496f8892f2aae09e4bdcde9b56204f445949fcb908affc80d16"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334045 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.334170 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.335069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.335088 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.335096 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.335620 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.335647 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"68f3e6433e6dbe10dcc229d5c31d7ce65423fce340bc30ba4a49223cee079b19"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.336912 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd" exitCode=0 Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.336986 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.337022 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"22c405550c2be2956f9d36ebc41db0fd9e82f4ade3f35cdaa81bfa75b17ef50d"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.337123 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.337790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.337812 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.337820 4728 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.338574 4728 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51" exitCode=0 Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.338603 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.338618 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b0784f94457b208c970dce8fd6ee54c99e81f677a323c4b47609ef5d10f66802"} Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.338701 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.339296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.339316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.339340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.341243 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.341861 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.341882 4728 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.341892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.697034 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="1.6s" Jan 25 05:38:30 crc kubenswrapper[4728]: W0125 05:38:30.722873 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.722970 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.882967 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.884155 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.884183 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.884192 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 25 05:38:30 crc kubenswrapper[4728]: I0125 05:38:30.884219 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.884625 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.50:6443: connect: connection refused" node="crc" Jan 25 05:38:30 crc kubenswrapper[4728]: W0125 05:38:30.898720 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.50:6443: connect: connection refused Jan 25 05:38:30 crc kubenswrapper[4728]: E0125 05:38:30.898792 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.50:6443: connect: connection refused" logger="UnhandledError" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.236188 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.293288 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 03:58:32.539421626 +0000 UTC Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.342893 4728 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f" exitCode=0 Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.342991 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.343231 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.344177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.344216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.344225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.345156 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.345254 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.346216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.346247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.346257 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.349687 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.349766 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.349782 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.349934 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.350882 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.350928 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.350942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352215 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352253 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352347 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352937 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352961 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.352972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.373358 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.373395 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.373412 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.373424 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.373434 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32"} Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.373530 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.374300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.374362 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.374374 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:31 crc kubenswrapper[4728]: I0125 05:38:31.573607 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.294138 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 01:59:27.903887256 +0000 UTC Jan 25 05:38:32 crc kubenswrapper[4728]: 
I0125 05:38:32.378859 4728 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06" exitCode=0 Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.378989 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.379454 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06"} Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.379567 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.380157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.380190 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.380195 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.380237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.380202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.380251 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.485105 4728 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.486236 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.486335 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.486395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:32 crc kubenswrapper[4728]: I0125 05:38:32.486466 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.294385 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 01:22:24.033385753 +0000 UTC Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384274 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428"} Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384340 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609"} Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384356 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0"} Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384365 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e"} Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da"} Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384443 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.384500 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.385506 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.385538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.385550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.385914 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.386008 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.386075 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:33 crc kubenswrapper[4728]: I0125 05:38:33.418194 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 25 05:38:34 crc 
kubenswrapper[4728]: I0125 05:38:34.295331 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 15:29:35.1457178 +0000 UTC Jan 25 05:38:34 crc kubenswrapper[4728]: I0125 05:38:34.386952 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:34 crc kubenswrapper[4728]: I0125 05:38:34.387774 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:34 crc kubenswrapper[4728]: I0125 05:38:34.387814 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:34 crc kubenswrapper[4728]: I0125 05:38:34.387825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.157844 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.158010 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.159055 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.159154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.159216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.295897 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-26 14:11:57.491231599 +0000 UTC Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.584149 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.584282 4728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.584345 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.585349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.585419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.585432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.628929 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.629079 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.630076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.630120 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:35 crc kubenswrapper[4728]: I0125 05:38:35.630132 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.217154 
4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.231163 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.296114 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 02:47:09.389598376 +0000 UTC Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.391997 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.392844 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.392897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:36 crc kubenswrapper[4728]: I0125 05:38:36.392914 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.296501 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 06:53:57.479126619 +0000 UTC Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.393931 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.394728 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.394762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:37 
crc kubenswrapper[4728]: I0125 05:38:37.394773 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.779602 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.779746 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.780579 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.780606 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:37 crc kubenswrapper[4728]: I0125 05:38:37.780615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.158031 4728 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.158072 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.296994 4728 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:29:35.184571827 +0000 UTC Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.941958 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.942129 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.943145 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.943175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:38 crc kubenswrapper[4728]: I0125 05:38:38.943185 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:39 crc kubenswrapper[4728]: I0125 05:38:39.297603 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:20:35.034839846 +0000 UTC Jan 25 05:38:39 crc kubenswrapper[4728]: E0125 05:38:39.382167 4728 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.030889 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.031214 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.031950 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:40 
crc kubenswrapper[4728]: I0125 05:38:40.031981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.031990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.035054 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.297703 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:12:58.2761796 +0000 UTC Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.401044 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.402245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.402305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.402315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:40 crc kubenswrapper[4728]: I0125 05:38:40.404681 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:41 crc kubenswrapper[4728]: E0125 05:38:41.239407 4728 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.293396 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.298442 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:33:06.60326964 +0000 UTC Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.359042 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.359133 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.362372 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.362448 4728 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.403263 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.404038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.404071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:41 crc kubenswrapper[4728]: I0125 05:38:41.404081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:42 crc kubenswrapper[4728]: I0125 05:38:42.299228 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 08:20:17.05774217 +0000 UTC Jan 25 05:38:43 crc kubenswrapper[4728]: I0125 05:38:43.300143 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:59:32.660297374 +0000 UTC Jan 25 05:38:44 crc kubenswrapper[4728]: I0125 05:38:44.301198 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 19:34:42.899728302 +0000 UTC Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.301898 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 04:00:03.99084982 +0000 UTC Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.379557 4728 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.390372 4728 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.589304 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.589460 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.589855 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.589915 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.590172 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.590203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.590213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:45 crc kubenswrapper[4728]: I0125 05:38:45.593238 4728 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.232111 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.232243 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.302278 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:06:05.293559494 +0000 UTC Jan 25 05:38:46 crc kubenswrapper[4728]: E0125 05:38:46.356371 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.357999 4728 trace.go:236] Trace[81195523]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Jan-2026 05:38:32.638) (total time: 13719ms): Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[81195523]: ---"Objects listed" error: 13719ms (05:38:46.357) Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[81195523]: [13.719326049s] [13.719326049s] END Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.358024 4728 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 25 05:38:46 crc 
kubenswrapper[4728]: I0125 05:38:46.358695 4728 trace.go:236] Trace[1235001713]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Jan-2026 05:38:32.093) (total time: 14264ms): Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[1235001713]: ---"Objects listed" error: 14264ms (05:38:46.358) Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[1235001713]: [14.264916308s] [14.264916308s] END Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.358718 4728 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.359615 4728 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.359775 4728 trace.go:236] Trace[1182959789]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Jan-2026 05:38:31.973) (total time: 14386ms): Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[1182959789]: ---"Objects listed" error: 14386ms (05:38:46.359) Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[1182959789]: [14.386178857s] [14.386178857s] END Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.359794 4728 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.360010 4728 trace.go:236] Trace[1917814475]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Jan-2026 05:38:33.081) (total time: 13278ms): Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[1917814475]: ---"Objects listed" error: 13278ms (05:38:46.359) Jan 25 05:38:46 crc kubenswrapper[4728]: Trace[1917814475]: [13.278834296s] [13.278834296s] END Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.360027 4728 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 25 05:38:46 crc kubenswrapper[4728]: E0125 05:38:46.360426 4728 kubelet_node_status.go:99] 
"Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 25 05:38:46 crc kubenswrapper[4728]: I0125 05:38:46.791345 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.137331 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.140544 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.291704 4728 apiserver.go:52] "Watching apiserver" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.293531 4728 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.293814 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.294137 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.294165 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.294219 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.294395 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.294509 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.294547 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.294661 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.294744 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.294683 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.296406 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.296426 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297116 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297144 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297224 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297375 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297387 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297416 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.297444 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.302410 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:15:40.163893373 +0000 UTC Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.319169 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.326805 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.335186 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.342536 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.348865 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.356151 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.362732 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.368974 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.374419 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.393076 4728 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.418737 4728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.418775 4728 kubelet.go:1929] "Failed creating a mirror pod for" 
err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466714 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466750 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466772 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466791 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466811 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466829 
4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466855 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466872 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466887 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466901 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466919 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466936 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466952 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466967 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.466997 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467017 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467035 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467052 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467067 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467084 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467101 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467116 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467132 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467151 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467167 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467183 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467200 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467218 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467234 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467251 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467283 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467301 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467335 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 
05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467361 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467382 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467401 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467419 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467435 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467452 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467467 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467485 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467520 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467536 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467555 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467575 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467592 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467609 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467646 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467663 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467679 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467698 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467715 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467734 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467754 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467774 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 25 05:38:47 crc 
kubenswrapper[4728]: I0125 05:38:47.467793 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467818 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468005 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468073 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468253 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468305 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468307 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468391 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468510 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.467809 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468680 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468709 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468732 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 
05:38:47.468756 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468779 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468798 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468817 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468872 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468895 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468900 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468919 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468908 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468929 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468988 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469099 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469570 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469638 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469855 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469772 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469800 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469818 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.468916 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469912 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469922 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470109 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470148 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470473 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470616 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470683 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470682 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470741 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.470799 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471009 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471032 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471039 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471204 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.469933 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471251 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471280 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471300 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471339 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471390 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471413 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471432 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471450 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471461 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471466 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471468 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471514 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471539 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471606 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 
05:38:47.471627 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471645 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471665 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471682 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471740 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471762 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471772 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471780 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471875 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471900 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471918 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471939 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471956 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.471960 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472008 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472027 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472209 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472411 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472441 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472460 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472476 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472492 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472508 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472542 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472558 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472566 4728 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472521 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472606 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472704 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472796 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472818 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472846 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472867 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472889 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472906 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472925 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472941 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472959 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472983 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 
05:38:47.472999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473015 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473033 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473051 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473071 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473087 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473104 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473120 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473137 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473153 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473168 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473184 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473201 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474270 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474297 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474316 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474347 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474365 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474384 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474399 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474416 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474433 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474451 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474467 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474486 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474506 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474544 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474562 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474579 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474596 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474615 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474633 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474649 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474668 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474682 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474701 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474719 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474737 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474755 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474772 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474788 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474807 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474826 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 25 05:38:47 
crc kubenswrapper[4728]: I0125 05:38:47.474851 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474868 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474887 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474904 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474921 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474937 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474953 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474982 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475014 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475033 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475050 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475066 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475082 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475100 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475117 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475133 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475150 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475167 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475184 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475201 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475218 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475233 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475252 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475267 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475284 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475300 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475329 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475349 
4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475366 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475397 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475414 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475429 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475445 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472847 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472876 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.472940 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473246 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473255 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473301 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473463 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473620 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473588 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473651 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473703 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473799 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473820 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473847 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473858 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473104 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.473978 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474282 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474788 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474872 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474868 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.474917 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475142 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475155 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475174 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475672 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475452 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475453 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475772 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475795 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475794 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475807 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475820 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475913 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475927 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475951 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475955 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475463 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.475978 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476002 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476023 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476050 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476129 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476146 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476150 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476174 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476189 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476208 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476218 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476240 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476257 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476267 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476278 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476281 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476295 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476380 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476402 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476419 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476434 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476450 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476473 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476482 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476528 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476533 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476553 4728 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476604 4728 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476620 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476630 4728 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476641 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476652 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476661 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476670 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476682 4728 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476691 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476701 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476710 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476718 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476726 4728 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476735 4728 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476743 4728 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476751 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476760 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476769 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476777 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476786 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476795 4728 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476804 4728 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476812 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476821 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476830 4728 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476850 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476859 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476868 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476878 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476887 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476895 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476904 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476913 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476921 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476931 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476941 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476941 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476669 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476703 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476782 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476897 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.476949 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477023 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477035 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477045 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477057 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477066 4728 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477076 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node 
\"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477085 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477088 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477095 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477132 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477148 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477161 4728 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.477165 4728 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.477240 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:47.977223476 +0000 UTC m=+19.013101456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477252 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477297 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477312 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477338 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477349 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477360 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477370 4728 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477379 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477392 4728 reconciler_common.go:293] "Volume 
detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477402 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477410 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477419 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477427 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477437 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477446 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477457 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477466 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477476 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477488 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477488 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477498 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477512 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.477763 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.478456 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.478648 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.479539 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.479867 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.479919 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:47.979905211 +0000 UTC m=+19.015783192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480026 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.479609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480206 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480519 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480550 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480805 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481060 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481104 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481118 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480690 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.480914 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481544 4728 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481627 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481689 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481782 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.481822 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.482112 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.482183 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.482240 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.482609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.482676 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.482884 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.483128 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.483361 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.483437 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.483590 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.483858 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.483962 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.484065 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.484234 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.484419 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.484605 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.484783 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.484993 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.485190 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.485420 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.485641 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.485744 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.485873 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:38:47.985856238 +0000 UTC m=+19.021734218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.485915 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486065 4728 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486076 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486085 4728 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 25 
05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486096 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486106 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486115 4728 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486124 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486133 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486142 4728 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486150 4728 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486158 4728 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486167 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486175 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486184 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486193 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.485897 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486471 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486723 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486733 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.486902 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.487160 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.487170 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.487257 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.487577 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.487595 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.487706 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488346 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488797 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488021 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488057 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488142 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488444 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488693 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488951 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.488975 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489017 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.489203 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.489224 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.489236 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.489269 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:47.989261525 +0000 UTC m=+19.025139506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489280 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489284 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489580 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489611 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489885 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.489752 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490108 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490156 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490240 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490375 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490377 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490410 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.490510 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.492334 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.492354 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.492368 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.492402 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2026-01-25 05:38:47.992391654 +0000 UTC m=+19.028269634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.492778 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493026 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493061 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493110 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493163 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493218 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493219 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493478 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493483 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493537 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493707 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.493738 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.494188 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.494702 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.494907 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.495431 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.495977 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.496094 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.496420 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.496448 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.497184 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.497412 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.497629 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.498843 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.500451 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.501173 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.507106 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.511631 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.514733 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.520980 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587044 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587113 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587151 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587163 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587150 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587174 4728 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587234 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587246 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587256 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587267 4728 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587277 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587279 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587331 4728 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587342 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587350 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587360 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587369 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587378 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587389 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587398 4728 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587407 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587415 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587423 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587431 4728 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587438 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587446 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587454 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587461 4728 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587470 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587478 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587487 4728 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587495 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587504 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587513 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587523 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587533 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587542 4728 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587550 4728 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587558 4728 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587566 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587573 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587583 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587591 4728 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587599 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587608 4728 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587616 4728 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587624 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587632 4728 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node 
\"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587640 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587648 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587656 4728 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587665 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587674 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587684 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587692 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc 
kubenswrapper[4728]: I0125 05:38:47.587700 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587708 4728 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587717 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587725 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587732 4728 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587740 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587749 4728 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587759 4728 reconciler_common.go:293] 
"Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587768 4728 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587777 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587785 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587794 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587803 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587811 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587819 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587827 4728 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587846 4728 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587854 4728 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587863 4728 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587871 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587880 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587888 4728 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on 
node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587897 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587906 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587914 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587922 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587930 4728 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587938 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587946 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587954 4728 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587962 4728 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587971 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587979 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587987 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.587995 4728 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588002 4728 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588011 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588018 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588027 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588034 4728 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588041 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588049 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588057 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588064 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 
05:38:47.588073 4728 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588081 4728 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588090 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588098 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588107 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588114 4728 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588123 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588131 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588140 4728 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588148 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.588156 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.608535 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.612648 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.618804 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 25 05:38:47 crc kubenswrapper[4728]: W0125 05:38:47.621711 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-13d1b8d653599b257b69d8ec32419a1d81bbc800fc51b0f6d4673acc180e4617 WatchSource:0}: Error finding container 13d1b8d653599b257b69d8ec32419a1d81bbc800fc51b0f6d4673acc180e4617: Status 404 returned error can't find the container with id 13d1b8d653599b257b69d8ec32419a1d81bbc800fc51b0f6d4673acc180e4617 Jan 25 05:38:47 crc kubenswrapper[4728]: W0125 05:38:47.627698 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-3fbbedca3e1a3c79a1bafa4ea00851b7bb2ff5f931912f0eb96a187c12969a60 WatchSource:0}: Error finding container 3fbbedca3e1a3c79a1bafa4ea00851b7bb2ff5f931912f0eb96a187c12969a60: Status 404 returned error can't find the container with id 3fbbedca3e1a3c79a1bafa4ea00851b7bb2ff5f931912f0eb96a187c12969a60 Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.991997 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992175 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:38:48.992143595 +0000 UTC m=+20.028021585 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.992566 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.992609 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.992634 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:47 crc kubenswrapper[4728]: I0125 05:38:47.992658 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992702 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992706 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992743 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:48.992734868 +0000 UTC m=+20.028612848 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992755 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992770 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992783 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992786 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992797 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992805 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992760 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:48.99275185 +0000 UTC m=+20.028629830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992824 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:48.992814347 +0000 UTC m=+20.028692337 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:47 crc kubenswrapper[4728]: E0125 05:38:47.992854 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:48.992848822 +0000 UTC m=+20.028726792 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.302855 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 03:24:40.99529302 +0000 UTC Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.417489 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327"} Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.417534 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"72bc2d675026778f4d249790bbf3db5e3af5e2f73cf85b71ab47d3733b710556"} Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.418437 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3fbbedca3e1a3c79a1bafa4ea00851b7bb2ff5f931912f0eb96a187c12969a60"} Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.419947 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45"} Jan 25 05:38:48 
crc kubenswrapper[4728]: I0125 05:38:48.419984 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9"} Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.419996 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"13d1b8d653599b257b69d8ec32419a1d81bbc800fc51b0f6d4673acc180e4617"} Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.434759 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.455494 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.465343 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.474500 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.483202 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.491971 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.503242 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.515634 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.525755 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.534080 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.543081 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.551730 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.560287 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.568370 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.576175 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.585641 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.964165 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.973274 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.973884 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.975922 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.982274 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.990125 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.999892 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:48 crc kubenswrapper[4728]: I0125 05:38:48.999943 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:48.999965 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:48.999983 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.000002 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000098 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000113 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000124 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000161 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:51.000151006 +0000 UTC m=+22.036028986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000178 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:38:51.00016881 +0000 UTC m=+22.036046800 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000225 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000235 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000243 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000264 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:51.000257907 +0000 UTC m=+22.036135887 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000289 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000308 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:51.000302812 +0000 UTC m=+22.036180792 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000123 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.000349 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:51.00034441 +0000 UTC m=+22.036222400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.000887 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a
145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.010659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.018982 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.028483 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.035554 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.044052 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.052771 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.060462 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.067944 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.105253 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.116273 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.125877 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.134507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.142478 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.303062 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 09:26:45.96165784 +0000 UTC Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.328466 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.328567 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.328676 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.328718 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.328770 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.329034 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.332026 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.332634 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.333535 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.334145 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.334703 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.335154 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.335860 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.336438 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.336995 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.337535 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.338053 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.338676 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.339132 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.339654 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.340135 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.341572 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.342436 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.343604 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.344215 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.344794 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.345114 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.345892 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.346717 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.347143 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.347823 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.348305 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.348901 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.349494 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.349943 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.350488 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.350950 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.351418 4728 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.351532 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.352865 4728 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.353389 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.353808 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.356442 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.357818 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.358417 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.359277 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.359889 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.360918 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.361362 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.362260 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.363033 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.363962 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.364434 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.365258 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.365772 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.366837 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.367249 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.368012 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.368462 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.368940 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.368915 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.369849 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.370290 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.381437 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.393002 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.402364 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.411948 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.422414 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.427710 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4"} Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.432259 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.441147 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.449785 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.460077 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.474021 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.485029 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.495082 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.504332 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.513619 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.523866 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.560482 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.561910 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.561941 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.561954 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.562012 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.567178 4728 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.567383 4728 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.568173 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.568199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.568209 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.568220 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.568229 4728 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.581428 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.583656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.583681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.583691 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.583702 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.583709 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.592642 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.595197 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.595226 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.595238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.595250 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.595259 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.604489 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.608619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.608658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.608667 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.608681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.608691 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.618088 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.620612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.620637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.620646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.620657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.620664 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.629285 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:49 crc kubenswrapper[4728]: E0125 05:38:49.629409 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.630813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.630846 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.630856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.630867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.630873 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.732378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.732440 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.732452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.732466 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.732476 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.834587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.834842 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.834856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.834868 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.834878 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.936305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.936349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.936360 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.936372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:49 crc kubenswrapper[4728]: I0125 05:38:49.936383 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:49Z","lastTransitionTime":"2026-01-25T05:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.038110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.038140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.038150 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.038160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.038167 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.140224 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.140249 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.140257 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.140269 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.140277 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.242702 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.242742 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.242751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.242764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.242781 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.304041 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:39:57.109596466 +0000 UTC
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.345363 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.345394 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.345405 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.345419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.345428 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.412233 4728 csr.go:261] certificate signing request csr-frqfb is approved, waiting to be issued
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.445897 4728 csr.go:257] certificate signing request csr-frqfb is issued
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.447369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.447393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.447405 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.447417 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.447425 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.549892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.549943 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.549955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.549976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.549986 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.655242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.655287 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.655299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.655339 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.655356 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.757728 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.757772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.757780 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.757797 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.757806 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.860007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.860059 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.860072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.860091 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.860102 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.962637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.962684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.962695 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.962714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:50 crc kubenswrapper[4728]: I0125 05:38:50.962724 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:50Z","lastTransitionTime":"2026-01-25T05:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.015092 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.015149 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.015175 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.015193 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015229 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:38:55.015209728 +0000 UTC m=+26.051087709 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.015257 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015290 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015336 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015351 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015380 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:55.01536481 +0000 UTC m=+26.051242791 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015386 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015399 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015351 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015440 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:55.015415235 +0000 UTC m=+26.051293215 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015444 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015474 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:55.015466722 +0000 UTC m=+26.051344702 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015475 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.015524 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:38:55.015500126 +0000 UTC m=+26.051378105 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.065333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.065380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.065390 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.065409 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.065422 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.167229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.167271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.167282 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.167352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.167368 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.200746 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vdkq2"]
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.201058 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-m8nhm"]
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.201210 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vdkq2"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.203498 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w9dvd"]
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.203770 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.203857 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.203958 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kdxw7"]
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.204375 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kdxw7"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.204413 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.204471 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.205712 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.206357 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.207936 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208068 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208461 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208525 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208547 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208604 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208670 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208772 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208774 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208863 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.208934 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216727 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/191ff4fd-0d05-4097-b136-5f443120b4e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216760 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-os-release\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216779 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/191ff4fd-0d05-4097-b136-5f443120b4e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216796 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-rootfs\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216813 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-mcd-auth-proxy-config\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216834 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-proxy-tls\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.216865 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-system-cni-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217018 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-system-cni-dir\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-os-release\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217067 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217082 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcwp\" (UniqueName: \"kubernetes.io/projected/191ff4fd-0d05-4097-b136-5f443120b4e7-kube-api-access-6mcwp\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217101 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkx89\" (UniqueName: \"kubernetes.io/projected/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-kube-api-access-xkx89\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217118 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-cni-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217133 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-cnibin\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217158 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-cnibin\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.217273 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.230622 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z"
Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.244402 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.258029 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.269260 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.269439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.269529 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.269608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.269691 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.272753 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.283289 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.293211 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.300580 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.304393 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 16:04:35.663299917 +0000 UTC Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.314095 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318406 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-cnibin\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318443 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2ffc038-3d70-4d2c-b150-e8529f622238-cni-binary-copy\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318469 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-k8s-cni-cncf-io\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318489 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-kubelet\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318509 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-mcd-auth-proxy-config\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318531 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-cnibin\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318625 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-cni-multus\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318680 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5klfk\" (UniqueName: \"kubernetes.io/projected/c2ffc038-3d70-4d2c-b150-e8529f622238-kube-api-access-5klfk\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318726 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8bzdw\" (UniqueName: \"kubernetes.io/projected/55b49f7c-8776-49b9-9897-6553e57e202b-kube-api-access-8bzdw\") pod \"node-resolver-vdkq2\" (UID: \"55b49f7c-8776-49b9-9897-6553e57e202b\") " pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318762 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-proxy-tls\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318795 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-netns\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318819 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-etc-kubernetes\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318883 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-system-cni-dir\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318904 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318921 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkx89\" (UniqueName: \"kubernetes.io/projected/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-kube-api-access-xkx89\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318944 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-cni-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318955 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-system-cni-dir\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318963 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-cnibin\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.318992 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/191ff4fd-0d05-4097-b136-5f443120b4e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319010 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-conf-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319028 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-os-release\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-socket-dir-parent\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319064 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/191ff4fd-0d05-4097-b136-5f443120b4e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319083 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-rootfs\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319099 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-daemon-config\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319117 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-hostroot\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319133 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-cni-bin\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319151 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-system-cni-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319167 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-multus-certs\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319182 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-cni-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319184 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55b49f7c-8776-49b9-9897-6553e57e202b-hosts-file\") pod \"node-resolver-vdkq2\" (UID: \"55b49f7c-8776-49b9-9897-6553e57e202b\") " pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-os-release\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319274 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-mcd-auth-proxy-config\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319273 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcwp\" (UniqueName: 
\"kubernetes.io/projected/191ff4fd-0d05-4097-b136-5f443120b4e7-kube-api-access-6mcwp\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319394 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-cnibin\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319405 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319439 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/191ff4fd-0d05-4097-b136-5f443120b4e7-os-release\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-rootfs\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319485 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-system-cni-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319253 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-os-release\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319689 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/191ff4fd-0d05-4097-b136-5f443120b4e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.319847 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/191ff4fd-0d05-4097-b136-5f443120b4e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.323450 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.325885 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-proxy-tls\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.328403 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.328509 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.328407 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.328599 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.328503 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:51 crc kubenswrapper[4728]: E0125 05:38:51.328864 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.336846 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkx89\" (UniqueName: \"kubernetes.io/projected/d10b5a2b-cd5b-4f07-a2a3-06c2c8437002-kube-api-access-xkx89\") pod \"machine-config-daemon-w9dvd\" (UID: \"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\") " pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.337962 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcwp\" (UniqueName: \"kubernetes.io/projected/191ff4fd-0d05-4097-b136-5f443120b4e7-kube-api-access-6mcwp\") pod \"multus-additional-cni-plugins-m8nhm\" (UID: \"191ff4fd-0d05-4097-b136-5f443120b4e7\") " pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.339708 4728 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd
3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c
36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.347993 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.357431 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.366926 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.372380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.372485 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.372504 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc 
kubenswrapper[4728]: I0125 05:38:51.372525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.372552 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.378532 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.388167 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.398738 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.409218 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.417256 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2ffc038-3d70-4d2c-b150-e8529f622238-cni-binary-copy\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420091 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-k8s-cni-cncf-io\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420112 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-kubelet\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-cni-multus\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420147 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5klfk\" (UniqueName: \"kubernetes.io/projected/c2ffc038-3d70-4d2c-b150-e8529f622238-kube-api-access-5klfk\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc 
kubenswrapper[4728]: I0125 05:38:51.420165 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bzdw\" (UniqueName: \"kubernetes.io/projected/55b49f7c-8776-49b9-9897-6553e57e202b-kube-api-access-8bzdw\") pod \"node-resolver-vdkq2\" (UID: \"55b49f7c-8776-49b9-9897-6553e57e202b\") " pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420190 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-netns\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420208 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-etc-kubernetes\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420232 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-cni-multus\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420251 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-k8s-cni-cncf-io\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420236 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-conf-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420370 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-socket-dir-parent\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-daemon-config\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420437 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-socket-dir-parent\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420299 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-netns\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420299 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-etc-kubernetes\") pod \"multus-kdxw7\" (UID: 
\"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420443 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-cni-bin\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420470 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-cni-bin\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420200 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-var-lib-kubelet\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420520 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-hostroot\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420261 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-conf-dir\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420541 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55b49f7c-8776-49b9-9897-6553e57e202b-hosts-file\") pod \"node-resolver-vdkq2\" (UID: \"55b49f7c-8776-49b9-9897-6553e57e202b\") " pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420564 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-multus-certs\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420559 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-hostroot\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420608 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c2ffc038-3d70-4d2c-b150-e8529f622238-host-run-multus-certs\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420643 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55b49f7c-8776-49b9-9897-6553e57e202b-hosts-file\") pod \"node-resolver-vdkq2\" (UID: \"55b49f7c-8776-49b9-9897-6553e57e202b\") " pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.420960 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2ffc038-3d70-4d2c-b150-e8529f622238-cni-binary-copy\") pod \"multus-kdxw7\" 
(UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.421046 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c2ffc038-3d70-4d2c-b150-e8529f622238-multus-daemon-config\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.425074 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.432898 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5klfk\" (UniqueName: \"kubernetes.io/projected/c2ffc038-3d70-4d2c-b150-e8529f622238-kube-api-access-5klfk\") pod \"multus-kdxw7\" (UID: \"c2ffc038-3d70-4d2c-b150-e8529f622238\") " pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.435302 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bzdw\" (UniqueName: 
\"kubernetes.io/projected/55b49f7c-8776-49b9-9897-6553e57e202b-kube-api-access-8bzdw\") pod \"node-resolver-vdkq2\" (UID: \"55b49f7c-8776-49b9-9897-6553e57e202b\") " pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.436081 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.447126 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-25 05:33:50 +0000 UTC, rotation deadline is 2026-11-04 05:36:58.632948002 +0000 UTC Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.447173 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6791h58m7.185777358s for next certificate rotation Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.447301 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.457183 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.475000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.475054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.475067 4728 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.475084 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.475095 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.515391 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vdkq2" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.520422 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.526284 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kdxw7" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.531246 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:38:51 crc kubenswrapper[4728]: W0125 05:38:51.531831 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b49f7c_8776_49b9_9897_6553e57e202b.slice/crio-9537860af8d8c7253da15c378d068a4f0dcb7477d9f0b22ce10b009d1a695485 WatchSource:0}: Error finding container 9537860af8d8c7253da15c378d068a4f0dcb7477d9f0b22ce10b009d1a695485: Status 404 returned error can't find the container with id 9537860af8d8c7253da15c378d068a4f0dcb7477d9f0b22ce10b009d1a695485 Jan 25 05:38:51 crc kubenswrapper[4728]: W0125 05:38:51.533967 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191ff4fd_0d05_4097_b136_5f443120b4e7.slice/crio-6aa8bbcf6b07ca7f61fff41d04461edb860e6b57e5381b07166c04313ae9c7d3 WatchSource:0}: Error finding container 6aa8bbcf6b07ca7f61fff41d04461edb860e6b57e5381b07166c04313ae9c7d3: Status 404 returned error can't find the container with id 6aa8bbcf6b07ca7f61fff41d04461edb860e6b57e5381b07166c04313ae9c7d3 Jan 25 05:38:51 crc kubenswrapper[4728]: W0125 05:38:51.542801 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ffc038_3d70_4d2c_b150_e8529f622238.slice/crio-faa15f2595213ff5980d56fdd7d54d8f35ba3c3aa4de09d11777f044912693f0 WatchSource:0}: Error finding container faa15f2595213ff5980d56fdd7d54d8f35ba3c3aa4de09d11777f044912693f0: Status 404 returned error can't find the container with id faa15f2595213ff5980d56fdd7d54d8f35ba3c3aa4de09d11777f044912693f0 Jan 25 05:38:51 crc kubenswrapper[4728]: W0125 05:38:51.544587 4728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd10b5a2b_cd5b_4f07_a2a3_06c2c8437002.slice/crio-c84421a11d0c9c2140bc9155ff449e654b69ec4afc6a66f3b0ce7ea92e03dda5 WatchSource:0}: Error finding container c84421a11d0c9c2140bc9155ff449e654b69ec4afc6a66f3b0ce7ea92e03dda5: Status 404 returned error can't find the container with id c84421a11d0c9c2140bc9155ff449e654b69ec4afc6a66f3b0ce7ea92e03dda5 Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.565911 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zmqrx"] Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.567156 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.570840 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.571019 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.571659 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.571959 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.572025 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.572107 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.581289 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.583125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.583157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.583167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.583181 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.583191 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.587629 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.598798 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.610625 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.621353 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.621847 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-kubelet\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.637291 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.649661 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.659246 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.667401 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.676349 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.686935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.686979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.686988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.686999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.687007 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.687293 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.698582 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 
05:38:51.709051 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.720311 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724050 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-var-lib-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724168 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-env-overrides\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724204 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724284 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4hx\" (UniqueName: \"kubernetes.io/projected/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-kube-api-access-dw4hx\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724475 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-slash\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724523 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-kubelet\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724558 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-node-log\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724581 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-kubelet\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724589 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovn-node-metrics-cert\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724670 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-systemd-units\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724692 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-ovn\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724718 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-config\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724791 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-bin\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724833 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724854 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-netd\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724877 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-script-lib\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724965 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-log-socket\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.724990 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-systemd\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.725029 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-netns\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.725054 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-etc-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.725116 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.740644 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.789357 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.789406 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.789418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.789440 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.789455 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825472 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-env-overrides\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825521 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825542 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4hx\" (UniqueName: \"kubernetes.io/projected/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-kube-api-access-dw4hx\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825561 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-slash\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825586 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-node-log\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 
05:38:51.825602 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovn-node-metrics-cert\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825622 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-systemd-units\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825639 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-ovn\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825658 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-config\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825676 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-bin\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825716 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-script-lib\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825735 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825756 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-netd\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825781 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-log-socket\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825803 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-netns\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825846 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-systemd\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825869 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-etc-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825884 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825884 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-node-log\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825952 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-bin\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825955 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.825892 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826334 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826396 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-netns\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826426 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-etc-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826443 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-slash\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826396 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-systemd-units\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826426 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-var-lib-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826429 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-ovn\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-var-lib-openvswitch\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826500 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-systemd\") pod \"ovnkube-node-zmqrx\" (UID: 
\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826502 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-netd\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826516 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-log-socket\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-env-overrides\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.826958 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-script-lib\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.827018 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-config\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc 
kubenswrapper[4728]: I0125 05:38:51.830104 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovn-node-metrics-cert\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.840193 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4hx\" (UniqueName: \"kubernetes.io/projected/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-kube-api-access-dw4hx\") pod \"ovnkube-node-zmqrx\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.892573 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.892621 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.892633 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.892652 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.892664 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.903867 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:51 crc kubenswrapper[4728]: W0125 05:38:51.941736 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb5cbcd_e874_4d07_a231_0eb38ef5fc5b.slice/crio-9db384fe5a3596a2ad0fd792fae12d6dadd0d16e2a7c5f023504eacf4a041f33 WatchSource:0}: Error finding container 9db384fe5a3596a2ad0fd792fae12d6dadd0d16e2a7c5f023504eacf4a041f33: Status 404 returned error can't find the container with id 9db384fe5a3596a2ad0fd792fae12d6dadd0d16e2a7c5f023504eacf4a041f33 Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.994789 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.994856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.994870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.994907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:51 crc kubenswrapper[4728]: I0125 05:38:51.994925 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:51Z","lastTransitionTime":"2026-01-25T05:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.098209 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.098568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.098578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.098602 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.098615 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.201373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.201419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.201430 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.201446 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.201457 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.303907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.303946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.303955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.303970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.303982 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.305047 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 00:29:30.036027575 +0000 UTC Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.406242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.406281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.406291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.406307 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.406331 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.434509 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" exitCode=0 Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.434610 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.434803 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"9db384fe5a3596a2ad0fd792fae12d6dadd0d16e2a7c5f023504eacf4a041f33"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.436529 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.436566 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.436577 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"c84421a11d0c9c2140bc9155ff449e654b69ec4afc6a66f3b0ce7ea92e03dda5"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 
05:38:52.438605 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerStarted","Data":"9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.438713 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerStarted","Data":"faa15f2595213ff5980d56fdd7d54d8f35ba3c3aa4de09d11777f044912693f0"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.440550 4728 generic.go:334] "Generic (PLEG): container finished" podID="191ff4fd-0d05-4097-b136-5f443120b4e7" containerID="ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e" exitCode=0 Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.440660 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerDied","Data":"ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.440725 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerStarted","Data":"6aa8bbcf6b07ca7f61fff41d04461edb860e6b57e5381b07166c04313ae9c7d3"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.442169 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vdkq2" event={"ID":"55b49f7c-8776-49b9-9897-6553e57e202b","Type":"ContainerStarted","Data":"2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.442272 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vdkq2" 
event={"ID":"55b49f7c-8776-49b9-9897-6553e57e202b","Type":"ContainerStarted","Data":"9537860af8d8c7253da15c378d068a4f0dcb7477d9f0b22ce10b009d1a695485"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.458530 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.469462 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.476229 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.486604 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.496563 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda6
55c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.505228 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.508637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.508666 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.508677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.508722 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.508735 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.518670 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.527675 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.537839 4728 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.553184 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b28016941
77413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.562762 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.574284 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.585477 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.601536 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.612071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.612229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.612556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 
05:38:52.612785 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.613022 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.614216 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.626861 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.637862 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.650755 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.663747 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.671617 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.683065 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.696924 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bd
a655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.708743 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.716744 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.716776 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.716787 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.716806 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.716817 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.722233 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.732147 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.748844 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.764054 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.779918 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:52Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.818586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.818632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.818643 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.818664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.818678 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.920557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.920590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.920599 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.920618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:52 crc kubenswrapper[4728]: I0125 05:38:52.920629 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:52Z","lastTransitionTime":"2026-01-25T05:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.023517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.023554 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.023564 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.023580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.023591 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.126041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.126076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.126085 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.126102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.126113 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.229540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.229572 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.229581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.229594 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.229606 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.305443 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 02:36:28.349013988 +0000 UTC Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.328856 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.328953 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:53 crc kubenswrapper[4728]: E0125 05:38:53.328960 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:53 crc kubenswrapper[4728]: E0125 05:38:53.329170 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.329186 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:53 crc kubenswrapper[4728]: E0125 05:38:53.329302 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.336422 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.336469 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.336482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.336497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.336512 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.438857 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.438883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.438892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.438904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.438913 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.449076 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.449135 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.449151 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.449165 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.449175 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.449185 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} Jan 25 05:38:53 crc kubenswrapper[4728]: 
I0125 05:38:53.450873 4728 generic.go:334] "Generic (PLEG): container finished" podID="191ff4fd-0d05-4097-b136-5f443120b4e7" containerID="d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558" exitCode=0 Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.450935 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerDied","Data":"d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.460851 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
1-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.470525 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.482046 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.492173 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.501859 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.512411 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.522507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.535877 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.541087 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.541268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.541280 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.541308 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.541333 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.551389 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.560975 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.571783 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.580925 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.592923 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.602792 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:53Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.644658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.644689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.644701 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc 
kubenswrapper[4728]: I0125 05:38:53.644716 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.644727 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.748239 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.748280 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.748290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.748306 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.748334 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.850831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.850876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.850888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.850911 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.850926 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.953369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.953423 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.953433 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.953449 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:53 crc kubenswrapper[4728]: I0125 05:38:53.953461 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:53Z","lastTransitionTime":"2026-01-25T05:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.055670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.055716 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.055728 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.055747 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.055763 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.158068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.158117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.158128 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.158146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.158159 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.262023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.262079 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.262092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.262111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.262129 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.306243 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 19:48:27.400261929 +0000 UTC Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.364930 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.364977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.364988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.365005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.365018 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.455952 4728 generic.go:334] "Generic (PLEG): container finished" podID="191ff4fd-0d05-4097-b136-5f443120b4e7" containerID="942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c" exitCode=0 Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.456004 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerDied","Data":"942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.466800 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.466838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.466851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.466870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.466883 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.475485 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.485911 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.511929 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.525160 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.541275 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.556774 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 
05:38:54.568769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.568805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.568824 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.568840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.568851 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.575428 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z 
is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.585525 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.596766 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 
05:38:54.605961 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.614428 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.624246 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.636759 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.652471 4728 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.671204 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.671251 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.671263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.671286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.671305 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.774450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.774495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.774505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.774521 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.774532 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.860714 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5kw62"] Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.861154 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.863166 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.863213 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.863258 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.863455 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.872218 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.876018 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.876050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.876064 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.876078 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.876087 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.881215 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.894560 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.902200 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.909781 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.918371 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.927022 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.939771 4728 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.953887 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.961761 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcwb\" (UniqueName: \"kubernetes.io/projected/12499e53-158e-42e5-ab05-3b37974a32e9-kube-api-access-wkcwb\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.961822 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12499e53-158e-42e5-ab05-3b37974a32e9-serviceca\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.961882 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12499e53-158e-42e5-ab05-3b37974a32e9-host\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.963112 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.972208 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.977878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.977918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.977931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 
05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.977951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.977964 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:54Z","lastTransitionTime":"2026-01-25T05:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.983729 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:54 crc kubenswrapper[4728]: I0125 05:38:54.994409 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:54Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.003614 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.017140 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.062983 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063103 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:39:03.06307709 +0000 UTC m=+34.098955080 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063255 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcwb\" (UniqueName: \"kubernetes.io/projected/12499e53-158e-42e5-ab05-3b37974a32e9-kube-api-access-wkcwb\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063371 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063459 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12499e53-158e-42e5-ab05-3b37974a32e9-serviceca\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063521 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063579 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:03.063568334 +0000 UTC m=+34.099446324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063637 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063720 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063808 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063891 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12499e53-158e-42e5-ab05-3b37974a32e9-host\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063810 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063996 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.063958 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12499e53-158e-42e5-ab05-3b37974a32e9-host\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.064015 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.064112 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:03.06409743 +0000 UTC m=+34.099975410 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063861 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.063878 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.064168 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.064178 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.064180 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:03.064148235 +0000 UTC m=+34.100026215 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.064255 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:03.064237423 +0000 UTC m=+34.100115403 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.064418 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/12499e53-158e-42e5-ab05-3b37974a32e9-serviceca\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.081044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.081069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.081079 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.081094 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.081105 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.081173 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcwb\" (UniqueName: \"kubernetes.io/projected/12499e53-158e-42e5-ab05-3b37974a32e9-kube-api-access-wkcwb\") pod \"node-ca-5kw62\" (UID: \"12499e53-158e-42e5-ab05-3b37974a32e9\") " pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.171006 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5kw62" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.182795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.182837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.182848 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.182867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.182879 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: W0125 05:38:55.185195 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12499e53_158e_42e5_ab05_3b37974a32e9.slice/crio-4e46415f40a4c5ec26353691f9e0b2ccf31e4376ffcdaa5be353a2267b3cba0a WatchSource:0}: Error finding container 4e46415f40a4c5ec26353691f9e0b2ccf31e4376ffcdaa5be353a2267b3cba0a: Status 404 returned error can't find the container with id 4e46415f40a4c5ec26353691f9e0b2ccf31e4376ffcdaa5be353a2267b3cba0a Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.285279 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.285357 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.285371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.285394 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.285406 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.306369 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:54:02.917700571 +0000 UTC Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.328040 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.328049 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.328058 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.328140 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.328236 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:55 crc kubenswrapper[4728]: E0125 05:38:55.328406 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.387212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.387246 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.387260 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.387277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.387287 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.461099 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerDied","Data":"0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.461432 4728 generic.go:334] "Generic (PLEG): container finished" podID="191ff4fd-0d05-4097-b136-5f443120b4e7" containerID="0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af" exitCode=0 Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.468080 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.469561 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5kw62" event={"ID":"12499e53-158e-42e5-ab05-3b37974a32e9","Type":"ContainerStarted","Data":"a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.469600 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5kw62" event={"ID":"12499e53-158e-42e5-ab05-3b37974a32e9","Type":"ContainerStarted","Data":"4e46415f40a4c5ec26353691f9e0b2ccf31e4376ffcdaa5be353a2267b3cba0a"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.477595 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.486711 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.489242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.489273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.489285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.489301 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.489312 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.494809 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.506852 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.517986 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.531384 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.541524 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.551477 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b
467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.561043 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.569353 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.577390 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.585392 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.591264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.591300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.591310 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.591344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.591356 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.594427 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z 
is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.603087 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f
25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.617997 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.633617 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.644405 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.654008 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.664954 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.678953 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05
:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.689131 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.693839 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.693885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.693900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.693922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.693937 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.701652 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.711027 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.721569 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.731275 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.741490 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.750558 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.760142 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.770977 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bd
a655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.780347 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:55Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.796445 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.796500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.796510 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.796541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.796563 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.898810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.898847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.898855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.898871 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:55 crc kubenswrapper[4728]: I0125 05:38:55.898885 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:55Z","lastTransitionTime":"2026-01-25T05:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.001569 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.001609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.001621 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.001638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.001648 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.105761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.105796 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.105839 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.105853 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.105864 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.207779 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.207806 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.207823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.207853 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.207861 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.307738 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 22:37:38.042865561 +0000 UTC Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.313637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.313729 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.313741 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.313758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.313772 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.415684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.415846 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.415906 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.415961 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.416013 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.474352 4728 generic.go:334] "Generic (PLEG): container finished" podID="191ff4fd-0d05-4097-b136-5f443120b4e7" containerID="a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3" exitCode=0 Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.474358 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerDied","Data":"a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.486747 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.496587 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.508345 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.518069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.518102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.518112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.518127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.518140 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.520041 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.532780 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae730
80be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.542788 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.551476 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.560608 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.570559 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.577875 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.587754 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.601803 4728 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.615910 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.619821 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.619851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.619862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.619881 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.619892 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.624971 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.632350 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:56Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.722133 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.722166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.722175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.722188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.722199 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.824725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.824762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.824771 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.824790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.824801 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.926717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.926742 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.926751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.926764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:56 crc kubenswrapper[4728]: I0125 05:38:56.926774 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:56Z","lastTransitionTime":"2026-01-25T05:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.028435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.028458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.028466 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.028477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.028485 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.130418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.130450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.130463 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.130479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.130488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.232336 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.232370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.232379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.232394 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.232408 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.307959 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:20:17.453208272 +0000 UTC Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.331688 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.331793 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:57 crc kubenswrapper[4728]: E0125 05:38:57.331953 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.332027 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:57 crc kubenswrapper[4728]: E0125 05:38:57.332135 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:57 crc kubenswrapper[4728]: E0125 05:38:57.332312 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.339365 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.339408 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.339422 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.339444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.339458 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.442480 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.442752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.442762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.442779 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.442793 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.480682 4728 generic.go:334] "Generic (PLEG): container finished" podID="191ff4fd-0d05-4097-b136-5f443120b4e7" containerID="56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071" exitCode=0 Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.480767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerDied","Data":"56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.487729 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.488335 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.488368 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.491743 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.506105 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.512122 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.512202 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.522078 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.531065 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.542693 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.545312 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.545413 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.545428 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.545473 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.545488 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.553025 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.562601 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.572990 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.584394 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b
60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.594705 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.604707 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.614281 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.623401 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.632713 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.641931 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.648959 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.648985 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.648995 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.649014 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.649025 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.652286 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.660826 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.669633 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.679105 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.688073 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.697583 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T
05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.705466 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.713047 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.720697 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.727102 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.735850 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.751148 4728 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.751216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.751228 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.751248 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.751260 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.753026 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.766583 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.773913 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.781015 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:57Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.853478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.853513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.853522 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.853537 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.853548 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.955041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.955070 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.955081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.955094 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:57 crc kubenswrapper[4728]: I0125 05:38:57.955104 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:57Z","lastTransitionTime":"2026-01-25T05:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.056571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.056595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.056605 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.056618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.056627 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.158870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.158901 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.158911 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.158925 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.158935 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.260167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.260203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.260214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.260225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.260235 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.308362 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 04:05:17.854584338 +0000 UTC Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.362110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.362160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.362174 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.362189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.362200 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.464268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.464298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.464307 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.464331 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.464351 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.493058 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" event={"ID":"191ff4fd-0d05-4097-b136-5f443120b4e7","Type":"ContainerStarted","Data":"11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.493172 4728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.507475 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.516936 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.522987 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.531407 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.539034 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.547026 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.563099 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.566779 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.566830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.566841 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.566859 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.566870 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.587942 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.602647 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.614177 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.628831 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.645833 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.654836 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.666955 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.668541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.668580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.668590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.668608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.668618 4728 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.680080 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:58Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.772118 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.772169 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.772181 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.772198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.772214 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.874898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.874939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.874950 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.874964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.874973 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.976958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.977007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.977017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.977032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:58 crc kubenswrapper[4728]: I0125 05:38:58.977044 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:58Z","lastTransitionTime":"2026-01-25T05:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.078785 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.078857 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.078870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.078892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.078906 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.181613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.181648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.181661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.181678 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.181691 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.218181 4728 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.284233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.284292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.284305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.284349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.284366 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.308904 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 13:30:12.679989525 +0000 UTC Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.328342 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.328377 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.328491 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.328521 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.328663 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.328770 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.343923 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.357025 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.371726 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.382931 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.385703 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.385733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.385744 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.385758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.385767 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.394985 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.403967 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.412369 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.425793 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.435774 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.444449 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.453298 4728 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.462452 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.469866 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.479116 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.488251 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bd
a655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.488615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.488635 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.488645 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.488658 4728 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.488666 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.497589 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/0.log" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.500344 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0" exitCode=1 Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.500395 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.501161 4728 scope.go:117] "RemoveContainer" containerID="f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.509317 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.524450 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05
:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.537362 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.549592 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.559758 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.569656 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.578579 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.586683 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.590549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.590576 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.590586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.590608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.590620 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.593575 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.601769 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.610883 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.619004 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.626717 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.634767 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.646476 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05
:38:59Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0125 05:38:59.156819 6018 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0125 05:38:59.157058 6018 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0125 05:38:59.157623 6018 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0125 05:38:59.157633 6018 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0125 05:38:59.157632 6018 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0125 05:38:59.157692 6018 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0125 05:38:59.157698 6018 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0125 05:38:59.157695 6018 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0125 05:38:59.157718 6018 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0125 05:38:59.157717 6018 handler.go:208] Removed *v1.Node event handler 7\\\\nI0125 05:38:59.157771 6018 handler.go:208] Removed *v1.Node event handler 2\\\\nI0125 05:38:59.157944 6018 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.693110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.693148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.693156 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.693173 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.693182 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.795617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.795655 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.795668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.795689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.795701 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.898071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.898116 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.898131 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.898147 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.898158 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.910484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.910554 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.910570 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.910595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.910617 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.924029 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.927160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.927200 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.927211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.927227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.927237 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.964184 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.967197 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.967229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.967240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.967256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:38:59 crc kubenswrapper[4728]: I0125 05:38:59.967269 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:38:59Z","lastTransitionTime":"2026-01-25T05:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.975993 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:38:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:38:59 crc kubenswrapper[4728]: E0125 05:38:59.976100 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.001112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.001140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.001150 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.001165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.001178 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.103584 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.103628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.103642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.103664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.103678 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.206030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.206066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.206077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.206092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.206106 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.307877 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.307916 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.307926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.307941 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.307950 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.309975 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 12:33:05.549174445 +0000 UTC Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.409373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.409413 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.409425 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.409436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.409446 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.504004 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/1.log" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.504511 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/0.log" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.506883 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8" exitCode=1 Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.506918 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.506946 4728 scope.go:117] "RemoveContainer" containerID="f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.507436 4728 scope.go:117] "RemoveContainer" containerID="f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8" Jan 25 05:39:00 crc kubenswrapper[4728]: E0125 05:39:00.507577 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.512001 4728 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.512029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.512038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.512053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.512062 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.523336 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.531900 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.539875 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.549346 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.557668 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.565642 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.574894 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.583341 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.592448 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.602742 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T
05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.611841 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.614352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.614391 4728 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.614407 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.614425 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.614439 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.621958 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.629967 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.638141 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.650671 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f650a609a2477df90923d89ac3235410795ea6f54d5491532582a977492ec2e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:38:59Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0125 05:38:59.156819 6018 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0125 05:38:59.157058 6018 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0125 05:38:59.157623 6018 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0125 05:38:59.157633 6018 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0125 05:38:59.157632 6018 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0125 05:38:59.157692 6018 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0125 05:38:59.157698 6018 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0125 05:38:59.157695 6018 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0125 05:38:59.157718 6018 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0125 05:38:59.157717 6018 handler.go:208] Removed *v1.Node event handler 7\\\\nI0125 05:38:59.157771 6018 handler.go:208] Removed *v1.Node event handler 2\\\\nI0125 05:38:59.157944 6018 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.716532 4728 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.716583 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.716596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.716615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.716627 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.818546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.818598 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.818608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.818625 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.818637 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.920207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.920233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.920245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.920259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:00 crc kubenswrapper[4728]: I0125 05:39:00.920302 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:00Z","lastTransitionTime":"2026-01-25T05:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.022505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.022537 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.022546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.022556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.022564 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.124514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.124637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.124691 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.124748 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.124797 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.226583 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.226685 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.226741 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.226797 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.226857 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.310528 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 09:36:31.450979744 +0000 UTC Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.327781 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:01 crc kubenswrapper[4728]: E0125 05:39:01.327881 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328144 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:01 crc kubenswrapper[4728]: E0125 05:39:01.328220 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328304 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:01 crc kubenswrapper[4728]: E0125 05:39:01.328422 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.328536 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.430070 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.430113 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.430126 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.430144 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.430157 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.510517 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/1.log" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.513355 4728 scope.go:117] "RemoveContainer" containerID="f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8" Jan 25 05:39:01 crc kubenswrapper[4728]: E0125 05:39:01.513486 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.522814 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.531760 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.531892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.531965 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.532024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.532095 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.532159 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.543038 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.552517 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.561119 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.568880 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.576811 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.583687 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.592112 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.601163 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bd
a655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.614275 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service 
openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.628077 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.633521 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.633551 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.633562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.633576 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.633586 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.640828 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:
31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.648709 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.655462 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.735808 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.735856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.735870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.735894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.735909 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.838068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.838128 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.838141 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.838157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.838169 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.940087 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.940125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.940138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.940158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:01 crc kubenswrapper[4728]: I0125 05:39:01.940167 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:01Z","lastTransitionTime":"2026-01-25T05:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.042516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.042551 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.042562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.042575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.042585 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.144780 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.144837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.144852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.144874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.144887 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.247183 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.247218 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.247229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.247242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.247250 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.310606 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:11:19.054878155 +0000 UTC Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.349563 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.349590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.349601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.349617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.349629 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.451825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.451852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.451862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.451876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.451888 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.554845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.554881 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.554890 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.554904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.554915 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.656951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.657005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.657019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.657032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.657040 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.720162 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9"] Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.720648 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.722354 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.722945 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.730397 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.739616 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.748789 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.758813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.758847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.758858 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.758871 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.758882 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.758876 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z 
is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.768213 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.777029 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 
05:39:02.789397 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service 
openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.798819 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.806488 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.813605 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.827406 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tw6\" (UniqueName: \"kubernetes.io/projected/a01f596a-1896-40e2-b9e8-990c387845a3-kube-api-access-d5tw6\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.827442 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a01f596a-1896-40e2-b9e8-990c387845a3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.827462 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a01f596a-1896-40e2-b9e8-990c387845a3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.827503 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a01f596a-1896-40e2-b9e8-990c387845a3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.829158 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-
25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resour
ces-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.837483 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.846734 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.853883 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.860928 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.860967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.860977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.860993 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.861003 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.862904 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.871137 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:02Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.936281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a01f596a-1896-40e2-b9e8-990c387845a3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.936956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a01f596a-1896-40e2-b9e8-990c387845a3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.936857 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tw6\" (UniqueName: 
\"kubernetes.io/projected/a01f596a-1896-40e2-b9e8-990c387845a3-kube-api-access-d5tw6\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.937551 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a01f596a-1896-40e2-b9e8-990c387845a3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.937724 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a01f596a-1896-40e2-b9e8-990c387845a3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.938105 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a01f596a-1896-40e2-b9e8-990c387845a3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.942853 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a01f596a-1896-40e2-b9e8-990c387845a3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 
05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.948985 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tw6\" (UniqueName: \"kubernetes.io/projected/a01f596a-1896-40e2-b9e8-990c387845a3-kube-api-access-d5tw6\") pod \"ovnkube-control-plane-749d76644c-jwzv9\" (UID: \"a01f596a-1896-40e2-b9e8-990c387845a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.964340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.964896 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.964945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.964960 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:02 crc kubenswrapper[4728]: I0125 05:39:02.964969 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:02Z","lastTransitionTime":"2026-01-25T05:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.031188 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" Jan 25 05:39:03 crc kubenswrapper[4728]: W0125 05:39:03.040994 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01f596a_1896_40e2_b9e8_990c387845a3.slice/crio-46a325185481dccbb70ff351e461ba0b9b5286268c065b964d389ce1c3445195 WatchSource:0}: Error finding container 46a325185481dccbb70ff351e461ba0b9b5286268c065b964d389ce1c3445195: Status 404 returned error can't find the container with id 46a325185481dccbb70ff351e461ba0b9b5286268c065b964d389ce1c3445195 Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.066940 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.066973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.066983 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.066998 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.067008 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.139760 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.139853 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.139886 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.139935 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:39:19.139902886 +0000 UTC m=+50.175780865 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.139966 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.140000 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140011 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140035 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140041 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140075 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140048 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140113 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:19.140094025 +0000 UTC m=+50.175972005 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140133 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:19.140124664 +0000 UTC m=+50.176002643 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140147 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:19.140140883 +0000 UTC m=+50.176018864 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140159 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140177 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.140188 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:03 crc 
kubenswrapper[4728]: E0125 05:39:03.140229 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:19.140221806 +0000 UTC m=+50.176099785 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.168880 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.168958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.168968 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.169008 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.169032 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.271827 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.272154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.272164 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.272179 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.272190 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.311152 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:12:42.271612795 +0000 UTC Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.328618 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.328679 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.328624 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.328765 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.328904 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:03 crc kubenswrapper[4728]: E0125 05:39:03.328994 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.374596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.374623 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.374634 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.374649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.374660 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.476541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.476574 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.476586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.476600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.476613 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.521291 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" event={"ID":"a01f596a-1896-40e2-b9e8-990c387845a3","Type":"ContainerStarted","Data":"c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.521360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" event={"ID":"a01f596a-1896-40e2-b9e8-990c387845a3","Type":"ContainerStarted","Data":"1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.521380 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" event={"ID":"a01f596a-1896-40e2-b9e8-990c387845a3","Type":"ContainerStarted","Data":"46a325185481dccbb70ff351e461ba0b9b5286268c065b964d389ce1c3445195"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.531246 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.544659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.562832 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.571814 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.578983 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.579010 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.579020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.579034 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.579045 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.581042 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.591843 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.600674 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.608817 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.619390 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.627352 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.651651 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-
01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.667631 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc 
kubenswrapper[4728]: I0125 05:39:03.681870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.681912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.681923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.681941 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.681956 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.686601 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.697878 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.705560 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.714813 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:03Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.784502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.784619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.784684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.784753 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.784824 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.886396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.886429 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.886439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.886455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.886468 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.988332 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.988355 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.988363 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.988372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:03 crc kubenswrapper[4728]: I0125 05:39:03.988379 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:03Z","lastTransitionTime":"2026-01-25T05:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.089657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.089701 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.089710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.089727 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.089737 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.191908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.191937 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.191946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.191958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.191968 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.293690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.293888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.293958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.294017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.294077 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.312172 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 05:00:48.807799241 +0000 UTC Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.396254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.396287 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.396297 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.396310 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.396340 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.465506 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-k5pj4"] Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.466166 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:04 crc kubenswrapper[4728]: E0125 05:39:04.466249 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.479653 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T
05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.488101 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.494499 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.498022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.498045 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.498054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.498069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.498079 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.503411 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.511756 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.520810 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.531258 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.539713 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.547710 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc 
kubenswrapper[4728]: I0125 05:39:04.551182 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7th\" (UniqueName: \"kubernetes.io/projected/accc0eb5-6067-4ab9-bbab-6d2ae898942f-kube-api-access-4b7th\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.551224 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.556313 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.564559 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.572723 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.580175 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.586542 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.594449 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.599985 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.600013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.600021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.600035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.600045 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.602277 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e8
0e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.614448 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service 
openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:04Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.652055 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7th\" (UniqueName: \"kubernetes.io/projected/accc0eb5-6067-4ab9-bbab-6d2ae898942f-kube-api-access-4b7th\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.652222 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:04 crc kubenswrapper[4728]: E0125 05:39:04.652365 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:04 crc kubenswrapper[4728]: E0125 05:39:04.652433 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:39:05.152414238 +0000 UTC m=+36.188292228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.665383 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7th\" (UniqueName: \"kubernetes.io/projected/accc0eb5-6067-4ab9-bbab-6d2ae898942f-kube-api-access-4b7th\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.702615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.702658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.702672 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.702692 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.702710 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.804837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.804888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.804902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.804920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.804930 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.906843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.906877 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.906887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.906903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:04 crc kubenswrapper[4728]: I0125 05:39:04.906913 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:04Z","lastTransitionTime":"2026-01-25T05:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.009261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.009289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.009298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.009310 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.009334 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.111420 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.111457 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.111465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.111485 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.111496 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.156408 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:05 crc kubenswrapper[4728]: E0125 05:39:05.156579 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:05 crc kubenswrapper[4728]: E0125 05:39:05.156655 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:39:06.156638513 +0000 UTC m=+37.192516493 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.213145 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.213165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.213175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.213191 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.213201 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.312568 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 10:05:41.591957448 +0000 UTC Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.315091 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.315199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.315274 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.315381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.315468 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.328447 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.328452 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:05 crc kubenswrapper[4728]: E0125 05:39:05.328679 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.328501 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:05 crc kubenswrapper[4728]: E0125 05:39:05.329004 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:05 crc kubenswrapper[4728]: E0125 05:39:05.328783 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.417268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.417410 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.417511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.417601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.417691 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.519220 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.519255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.519263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.519279 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.519291 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.621157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.621187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.621195 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.621212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.621220 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.722837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.722972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.723035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.723116 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.723176 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.825066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.825129 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.825146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.825168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.825183 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.825556 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.826268 4728 scope.go:117] "RemoveContainer" containerID="f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8" Jan 25 05:39:05 crc kubenswrapper[4728]: E0125 05:39:05.826492 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.927584 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.927634 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.927646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.927661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:05 crc kubenswrapper[4728]: I0125 05:39:05.927670 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:05Z","lastTransitionTime":"2026-01-25T05:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.029340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.029451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.029522 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.029596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.029654 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.131514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.131630 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.131714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.131786 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.131850 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.165355 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:06 crc kubenswrapper[4728]: E0125 05:39:06.165487 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:06 crc kubenswrapper[4728]: E0125 05:39:06.165561 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:39:08.165541809 +0000 UTC m=+39.201419789 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.233227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.233358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.233432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.233497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.233557 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.314009 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:58:12.647711383 +0000 UTC Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.328401 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:06 crc kubenswrapper[4728]: E0125 05:39:06.328547 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.335420 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.335470 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.335485 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.335505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.335524 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.437460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.437486 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.437495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.437508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.437516 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.539313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.539361 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.539372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.539383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.539390 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.641714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.641756 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.641766 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.641782 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.641801 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.743368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.743411 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.743426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.743436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.743445 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.845647 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.845681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.845690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.845704 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.845716 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.947148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.947182 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.947193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.947205 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:06 crc kubenswrapper[4728]: I0125 05:39:06.947214 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:06Z","lastTransitionTime":"2026-01-25T05:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.049651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.049722 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.049736 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.049752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.049763 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.151624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.151646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.151657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.151667 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.151677 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.252964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.252990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.252999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.253029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.253038 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.314832 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:42:00.18752861 +0000 UTC Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.328164 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.328214 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:07 crc kubenswrapper[4728]: E0125 05:39:07.328277 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.328306 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:07 crc kubenswrapper[4728]: E0125 05:39:07.328415 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:07 crc kubenswrapper[4728]: E0125 05:39:07.328563 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.354679 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.354705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.354714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.354724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.354731 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.456679 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.456715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.456724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.456738 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.456749 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.558957 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.559008 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.559019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.559040 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.559048 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.660979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.661010 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.661019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.661048 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.661057 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.767438 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.767476 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.767485 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.767500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.767510 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.869771 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.869831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.869858 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.869876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.869889 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.972154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.972214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.972226 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.972248 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:07 crc kubenswrapper[4728]: I0125 05:39:07.972273 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:07Z","lastTransitionTime":"2026-01-25T05:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.074693 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.074737 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.074749 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.074767 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.074779 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.177556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.177593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.177603 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.177617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.177628 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.182032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:08 crc kubenswrapper[4728]: E0125 05:39:08.182180 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:08 crc kubenswrapper[4728]: E0125 05:39:08.182235 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:39:12.182218937 +0000 UTC m=+43.218096917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.278955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.279008 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.279017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.279035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.279045 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.315604 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 21:23:25.902002839 +0000 UTC Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.327804 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:08 crc kubenswrapper[4728]: E0125 05:39:08.327927 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.381082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.381111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.381120 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.381131 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.381145 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.482624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.482661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.482670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.482685 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.482698 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.584938 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.584962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.584971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.584981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.584990 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.687027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.687054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.687062 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.687071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.687080 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.788242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.788266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.788275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.788284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.788292 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.889627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.889653 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.889661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.889671 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.889678 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.991255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.991282 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.991292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.991302 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:08 crc kubenswrapper[4728]: I0125 05:39:08.991310 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:08Z","lastTransitionTime":"2026-01-25T05:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.093459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.093503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.093518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.093537 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.093576 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.195856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.195889 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.195898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.195909 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.195917 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.297638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.297667 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.297677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.297687 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.297695 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.315955 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 22:14:57.484659807 +0000 UTC Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.327921 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.328077 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.328117 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:09 crc kubenswrapper[4728]: E0125 05:39:09.328152 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:09 crc kubenswrapper[4728]: E0125 05:39:09.328070 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:09 crc kubenswrapper[4728]: E0125 05:39:09.328299 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.338053 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.347015 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.355544 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.364734 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.371943 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.378443 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc 
kubenswrapper[4728]: I0125 05:39:09.387078 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d96
36a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.396105 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.400464 4728 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.400506 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.400517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.400535 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.400551 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.407828 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.416758 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.424072 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.433196 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.442469 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.455832 4728 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service 
openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.472594 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.479848 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.486556 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.503500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.503533 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.503542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.503558 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.503571 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.605962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.606018 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.606030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.606051 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.606064 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.707737 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.707768 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.707778 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.707802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.707812 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.809975 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.810010 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.810020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.810036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.810046 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.912152 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.912175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.912184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.912194 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.912201 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.993513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.993542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.993551 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.993562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:09 crc kubenswrapper[4728]: I0125 05:39:09.993571 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:09Z","lastTransitionTime":"2026-01-25T05:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.002395 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:10Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.005026 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.005046 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.005055 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.005066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.005074 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.013660 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:10Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.021625 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.021654 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.021664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.021674 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.021681 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.030634 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:10Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.032956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.033045 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.033102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.033162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.033219 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.041382 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:10Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.044063 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.044149 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.044208 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.044264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.044344 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.052909 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:10Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.053032 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.054148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.054178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.054188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.054204 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.054213 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.155951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.155982 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.155991 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.156002 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.156010 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.257178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.257344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.257431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.257489 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.257555 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.316211 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 05:18:36.704092288 +0000 UTC Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.328561 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:10 crc kubenswrapper[4728]: E0125 05:39:10.328706 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.359796 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.359833 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.359848 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.359867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.359877 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.461286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.461340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.461349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.461360 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.461369 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.563269 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.563289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.563297 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.563305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.563311 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.666155 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.666190 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.666200 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.666216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.666227 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.768277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.768341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.768357 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.768376 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.768387 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.869916 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.869960 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.869973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.869986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.869997 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.971718 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.971751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.971763 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.971775 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:10 crc kubenswrapper[4728]: I0125 05:39:10.971799 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:10Z","lastTransitionTime":"2026-01-25T05:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.074005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.074040 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.074052 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.074065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.074075 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.175415 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.175451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.175459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.175472 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.175482 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.277053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.277087 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.277096 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.277108 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.277115 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.316283 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 15:27:04.599564991 +0000 UTC Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.328124 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.328145 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.328151 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:11 crc kubenswrapper[4728]: E0125 05:39:11.328450 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:11 crc kubenswrapper[4728]: E0125 05:39:11.328516 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:11 crc kubenswrapper[4728]: E0125 05:39:11.328631 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.379448 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.379492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.379503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.379516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.379527 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.481765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.481818 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.481830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.481843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.481854 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.584525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.584560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.584571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.584587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.584600 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.686272 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.686360 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.686378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.686397 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.686413 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.788145 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.788179 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.788189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.788204 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.788217 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.889875 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.889904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.889912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.889926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.889936 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.991344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.991371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.991381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.991393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:11 crc kubenswrapper[4728]: I0125 05:39:11.991401 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:11Z","lastTransitionTime":"2026-01-25T05:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.093069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.093102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.093113 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.093127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.093139 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.195430 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.195469 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.195477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.195511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.195522 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.219079 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:12 crc kubenswrapper[4728]: E0125 05:39:12.219186 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:12 crc kubenswrapper[4728]: E0125 05:39:12.219239 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:39:20.219226015 +0000 UTC m=+51.255103986 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.297666 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.297717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.297729 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.297742 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.297752 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.317079 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 23:01:25.276644428 +0000 UTC Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.328427 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:12 crc kubenswrapper[4728]: E0125 05:39:12.328527 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.399238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.399266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.399277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.399287 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.399294 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.501337 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.501357 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.501368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.501377 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.501384 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.602677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.602703 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.602711 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.602721 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.602729 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.703969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.704029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.704038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.704049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.704057 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.806557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.806586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.806596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.806609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.806617 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.908559 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.908601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.908613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.908627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:12 crc kubenswrapper[4728]: I0125 05:39:12.908638 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:12Z","lastTransitionTime":"2026-01-25T05:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.010314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.010469 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.010526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.010590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.010661 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.112426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.112454 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.112464 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.112475 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.112489 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.214416 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.214440 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.214452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.214462 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.214470 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.316617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.316646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.316655 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.316668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.316676 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.317748 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 01:55:43.934680189 +0000 UTC Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.328042 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.328058 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:13 crc kubenswrapper[4728]: E0125 05:39:13.328127 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.328035 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:13 crc kubenswrapper[4728]: E0125 05:39:13.328196 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:13 crc kubenswrapper[4728]: E0125 05:39:13.328242 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.418347 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.418394 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.418402 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.418412 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.418584 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.520499 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.520530 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.520537 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.520550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.520562 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.622491 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.622521 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.622532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.622543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.622556 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.725166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.725202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.725210 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.725223 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.725233 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.826589 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.826618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.826628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.826638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.826645 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.928456 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.928480 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.928487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.928496 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:13 crc kubenswrapper[4728]: I0125 05:39:13.928503 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:13Z","lastTransitionTime":"2026-01-25T05:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.030347 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.030372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.030380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.030389 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.030396 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.132373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.132493 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.132545 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.132602 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.132660 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.234452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.234583 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.234674 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.234737 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.234807 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.318568 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 17:54:02.216495994 +0000 UTC Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.327830 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:14 crc kubenswrapper[4728]: E0125 05:39:14.327966 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.336541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.336572 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.336582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.336594 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.336602 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.437703 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.437725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.437732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.437740 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.437747 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.538856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.538894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.538902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.538912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.538920 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.640475 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.640507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.640516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.640526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.640534 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.741640 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.741667 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.741678 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.741687 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.741694 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.843360 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.843395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.843405 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.843414 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.843421 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.945690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.945716 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.945723 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.945733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:14 crc kubenswrapper[4728]: I0125 05:39:14.945740 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:14Z","lastTransitionTime":"2026-01-25T05:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.047337 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.047359 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.047368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.047378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.047387 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.148439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.148465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.148474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.148484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.148491 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.249955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.249994 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.250004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.250019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.250029 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.318687 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 18:53:18.641453344 +0000 UTC Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.328040 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.328039 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.328211 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:15 crc kubenswrapper[4728]: E0125 05:39:15.328147 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:15 crc kubenswrapper[4728]: E0125 05:39:15.328313 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:15 crc kubenswrapper[4728]: E0125 05:39:15.328372 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.351804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.351825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.351834 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.351843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.351850 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.453802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.453833 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.453842 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.453852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.453863 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.555372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.555401 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.555412 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.555425 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.555434 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.657214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.657247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.657255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.657268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.657277 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.758830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.758899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.758911 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.758933 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.758950 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.860524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.860559 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.860567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.860581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.860590 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.962845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.962879 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.962888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.962899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:15 crc kubenswrapper[4728]: I0125 05:39:15.962908 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:15Z","lastTransitionTime":"2026-01-25T05:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.064395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.064447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.064458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.064474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.064487 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.166240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.166276 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.166283 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.166296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.166305 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.268174 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.268213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.268225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.268241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.268253 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.318754 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 02:04:34.722800278 +0000 UTC Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.328202 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:16 crc kubenswrapper[4728]: E0125 05:39:16.328309 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.369574 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.369595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.369603 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.369613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.369622 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.471837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.471865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.471874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.471887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.471897 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.573401 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.573435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.573447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.573460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.573471 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.674846 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.674867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.674874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.674886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.674894 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.775901 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.775922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.775930 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.775939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.775946 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.877155 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.877181 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.877191 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.877201 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.877207 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.978898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.978925 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.978934 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.978945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:16 crc kubenswrapper[4728]: I0125 05:39:16.978954 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:16Z","lastTransitionTime":"2026-01-25T05:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.080410 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.080433 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.080441 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.080451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.080463 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.181926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.181946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.181954 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.181962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.181970 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.283676 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.283695 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.283704 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.283714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.283722 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.319568 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 11:50:35.651444227 +0000 UTC Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.328290 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:17 crc kubenswrapper[4728]: E0125 05:39:17.328405 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.328455 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:17 crc kubenswrapper[4728]: E0125 05:39:17.328571 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.328628 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:17 crc kubenswrapper[4728]: E0125 05:39:17.328784 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.385106 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.385138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.385147 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.385158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.385169 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.487857 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.487898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.487907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.487922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.487933 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.589568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.589628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.589639 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.589673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.589684 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.691602 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.691632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.691642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.691674 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.691684 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.783003 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.791852 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.793138 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nod
e-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.793517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.793543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.793553 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.793566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.793575 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.806352 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.813669 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.823133 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.831015 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.839519 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc 
kubenswrapper[4728]: I0125 05:39:17.849673 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.858118 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.865707 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.872908 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.879157 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.886914 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.895513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.895540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.895552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.895565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.895574 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.896915 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.904726 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 
05:39:17.912094 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.919714 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.931012 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:17Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.997384 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.997435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.997445 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.997462 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:17 crc kubenswrapper[4728]: I0125 05:39:17.997491 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:17Z","lastTransitionTime":"2026-01-25T05:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.099711 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.099740 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.099748 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.099769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.099778 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.201517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.201545 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.201554 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.201564 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.201572 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.302869 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.302901 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.302910 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.302922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.302931 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.320211 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 04:44:38.096798559 +0000 UTC Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.328628 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:18 crc kubenswrapper[4728]: E0125 05:39:18.328974 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.329189 4728 scope.go:117] "RemoveContainer" containerID="f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.404314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.404485 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.404494 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.404508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.404516 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.506531 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.506565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.506574 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.506587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.506613 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.557583 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/1.log" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.559580 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.560008 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.569110 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.583795 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.599250 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.609579 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.609608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.609616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.609630 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.609639 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.610707 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.620570 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.627556 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.635857 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.644936 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\
\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25
T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.653084 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ap
prover\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc 
kubenswrapper[4728]: I0125 05:39:18.661183 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.669017 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.676519 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.684552 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.692372 4728 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f
11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.703887 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide 
L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.712032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.712068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.712092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.712108 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.712117 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.730561 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.739369 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.746407 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:18Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.814202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.814241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.814250 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.814264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.814274 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.916436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.916468 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.916477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.916489 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:18 crc kubenswrapper[4728]: I0125 05:39:18.916498 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:18Z","lastTransitionTime":"2026-01-25T05:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.018723 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.018755 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.018776 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.018792 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.018800 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.120307 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.120348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.120358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.120369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.120377 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.174925 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.174973 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.174993 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.175024 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.175049 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175096 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:39:51.175077558 +0000 UTC m=+82.210955548 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175121 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175148 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175162 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:51.175153682 +0000 UTC m=+82.211031662 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175123 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175179 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:51.175170373 +0000 UTC m=+82.211048363 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175187 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175200 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175221 4728 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:51.175214807 +0000 UTC m=+82.211092786 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175255 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175292 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175308 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.175377 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:39:51.175361312 +0000 UTC m=+82.211239292 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.221549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.221572 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.221580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.221591 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.221599 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.320722 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 20:55:14.851205359 +0000 UTC Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.322940 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.322970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.322980 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.322994 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.323003 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.328349 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.328373 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.328450 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.328465 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.328534 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.328585 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.341701 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service 
openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide 
L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.353444 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.361268 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.368070 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.374014 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.387462 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05
:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.407991 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.425264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.425550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.425622 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.425698 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.425339 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.425755 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.433995 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9
cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.441449 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc 
kubenswrapper[4728]: I0125 05:39:19.449971 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.459222 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.467052 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.474712 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.481184 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.489816 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.499829 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bd
a655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.509590 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.527456 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.527492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.527502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.527516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.527526 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.563557 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/2.log" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.564048 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/1.log" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.565851 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7" exitCode=1 Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.565885 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.565917 4728 scope.go:117] "RemoveContainer" containerID="f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.566388 4728 scope.go:117] "RemoveContainer" containerID="531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7" Jan 25 05:39:19 crc kubenswrapper[4728]: E0125 05:39:19.566533 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
20s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.574283 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.582407 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.590906 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T
05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.598917 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.607208 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.615681 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.623929 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.629452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.629477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.629488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.629501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.629511 4728 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.631483 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.644540 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12889a34b26720c4a8e4cac31cb0394e92dd50969a02ec8a8cb7bf52871c1b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:00Z\\\",\\\"message\\\":\\\":73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0125 05:39:00.194145 6148 services_controller.go:445] Built service 
openshift-machine-api/machine-api-operator-webhook LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:00.194151 6148 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:00Z is after 2025-08-24T17:21:41Z]\\\\nI0125 05:39:00.194156 6148 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-webhook cluster-wide L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mou
ntPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.659071 4728 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd
3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c
36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.666568 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d28
2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.673504 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.683481 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9
cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.690713 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc 
kubenswrapper[4728]: I0125 05:39:19.699459 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.707277 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.714973 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.725050 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:19Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.731431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.731457 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.731465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.731478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.731486 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.833444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.833587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.833669 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.833823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.833884 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.935271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.935492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.935550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.935612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:19 crc kubenswrapper[4728]: I0125 05:39:19.935681 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:19Z","lastTransitionTime":"2026-01-25T05:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.037038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.037362 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.037426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.037497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.037563 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.139810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.139837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.139845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.139856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.139868 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.241501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.241540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.241549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.241560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.241568 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.283207 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.283415 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.283470 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:39:36.283456904 +0000 UTC m=+67.319334884 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.290527 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.290578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.290589 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.290606 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.290617 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.299216 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.301452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.301474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.301483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.301494 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.301504 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.309591 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.311765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.311788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.311799 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.311809 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.311819 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.319451 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.320856 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 03:37:50.409885063 +0000 UTC Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.321576 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.321597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.321606 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.321615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.321622 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.328215 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.328372 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.329003 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.330903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.330926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.330935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.330945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.330951 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.338631 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.338868 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.342627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.342650 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.342658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.342669 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.342676 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.443891 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.444001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.444089 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.444162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.444223 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.546218 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.546267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.546276 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.546290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.546300 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.569630 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/2.log" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.572651 4728 scope.go:117] "RemoveContainer" containerID="531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7" Jan 25 05:39:20 crc kubenswrapper[4728]: E0125 05:39:20.572808 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.583209 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.591185 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.598019 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc 
kubenswrapper[4728]: I0125 05:39:20.606741 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.615383 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.625235 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.633597 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.640097 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.647983 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.648028 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.648040 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.648054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.648064 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.648660 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z 
is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.657230 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.665086 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 
05:39:20.673487 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.681794 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.690446 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.702739 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.708899 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.721405 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05
:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.728251 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:20Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.749893 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.749976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.750054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc 
kubenswrapper[4728]: I0125 05:39:20.750124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.750186 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.851845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.851888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.851899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.851912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.851921 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.953200 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.953370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.953458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.953526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:20 crc kubenswrapper[4728]: I0125 05:39:20.953591 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:20Z","lastTransitionTime":"2026-01-25T05:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.055144 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.055394 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.055489 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.055568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.055622 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.157481 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.157506 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.157514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.157526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.157535 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.259490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.259536 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.259547 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.259562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.259574 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.322052 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:48:59.800545274 +0000 UTC Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.328468 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.328489 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:21 crc kubenswrapper[4728]: E0125 05:39:21.328575 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.328472 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:21 crc kubenswrapper[4728]: E0125 05:39:21.328769 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:21 crc kubenswrapper[4728]: E0125 05:39:21.328861 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.361278 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.361300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.361308 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.361348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.361358 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.463345 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.463375 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.463384 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.463399 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.463409 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.565241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.565491 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.565501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.565516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.565526 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.667179 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.667211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.667220 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.667234 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.667244 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.769041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.769061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.769071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.769082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.769090 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.871025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.871048 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.871056 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.871066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.871073 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.973044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.973072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.973081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.973093 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:21 crc kubenswrapper[4728]: I0125 05:39:21.973100 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:21Z","lastTransitionTime":"2026-01-25T05:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.075583 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.075609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.075617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.075628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.075637 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.177591 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.177672 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.177683 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.177715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.177725 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.280127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.280175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.280184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.280199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.280208 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.322755 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 04:55:58.404842726 +0000 UTC Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.327999 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:22 crc kubenswrapper[4728]: E0125 05:39:22.328091 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.381807 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.381837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.381845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.381861 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.381872 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.483918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.483946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.483956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.483969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.483978 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.585460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.585490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.585508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.585520 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.585530 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.687724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.687777 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.687788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.687800 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.687821 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.789248 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.789283 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.789291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.789305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.789314 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.891261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.891298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.891307 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.891334 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.891345 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.992968 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.992996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.993005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.993014 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:22 crc kubenswrapper[4728]: I0125 05:39:22.993021 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:22Z","lastTransitionTime":"2026-01-25T05:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.094986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.095027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.095035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.095050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.095060 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.196779 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.196805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.196813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.196824 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.196832 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.298599 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.298628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.298637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.298647 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.298657 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.323255 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:21:17.282411959 +0000 UTC Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.328546 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.328562 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.328573 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:23 crc kubenswrapper[4728]: E0125 05:39:23.328652 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:23 crc kubenswrapper[4728]: E0125 05:39:23.328738 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:23 crc kubenswrapper[4728]: E0125 05:39:23.328825 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.400313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.400358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.400369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.400380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.400388 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.502586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.502615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.502625 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.502640 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.502654 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.604717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.604742 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.604762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.604772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.604780 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.706534 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.706557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.706566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.706575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.706583 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.808436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.808501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.808519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.808539 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.808559 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.910264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.910303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.910315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.910356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:23 crc kubenswrapper[4728]: I0125 05:39:23.910369 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:23Z","lastTransitionTime":"2026-01-25T05:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.012223 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.012267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.012280 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.012295 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.012310 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.114133 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.114173 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.114184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.114200 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.114211 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.215607 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.215633 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.215642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.215652 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.215659 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.317467 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.317494 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.317501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.317511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.317520 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.323919 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 07:04:26.202758934 +0000 UTC Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.328153 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:24 crc kubenswrapper[4728]: E0125 05:39:24.328240 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.419598 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.419628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.419637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.419648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.419658 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.521366 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.521397 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.521406 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.521419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.521429 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.623335 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.623366 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.623375 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.623384 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.623407 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.725266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.725299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.725308 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.725340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.725350 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.827128 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.827162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.827171 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.827184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.827195 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.928976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.929004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.929013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.929024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:24 crc kubenswrapper[4728]: I0125 05:39:24.929034 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:24Z","lastTransitionTime":"2026-01-25T05:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.031036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.031068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.031077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.031090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.031099 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.132970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.133011 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.133022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.133036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.133046 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.234401 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.234435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.234444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.234458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.234467 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.324281 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:32:41.338814929 +0000 UTC Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.328522 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.328554 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:25 crc kubenswrapper[4728]: E0125 05:39:25.328634 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.328720 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:25 crc kubenswrapper[4728]: E0125 05:39:25.328831 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:25 crc kubenswrapper[4728]: E0125 05:39:25.328944 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.336429 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.336455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.336465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.336475 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.336483 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.437727 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.437769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.437778 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.437790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.437798 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.539917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.539939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.539950 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.539967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.539974 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.641097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.641123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.641132 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.641143 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.641152 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.742479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.742510 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.742518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.742526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.742533 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.844484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.844514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.844525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.844537 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.844546 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.946566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.946616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.946626 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.946639 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:25 crc kubenswrapper[4728]: I0125 05:39:25.946647 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:25Z","lastTransitionTime":"2026-01-25T05:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.048187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.048304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.048531 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.048595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.048655 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.150498 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.150616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.150690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.150764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.150827 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.252724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.252762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.252772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.252782 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.252790 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.325198 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 06:54:18.204270728 +0000 UTC Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.328234 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:26 crc kubenswrapper[4728]: E0125 05:39:26.328399 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.354195 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.354221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.354230 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.354241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.354248 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.456268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.456305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.456314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.456357 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.456370 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.558066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.558090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.558099 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.558112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.558119 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.659219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.659255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.659264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.659277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.659305 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.761637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.761664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.761673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.761681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.761689 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.863654 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.863689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.863697 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.863710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.863718 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.965875 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.965899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.965912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.965921 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:26 crc kubenswrapper[4728]: I0125 05:39:26.965929 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:26Z","lastTransitionTime":"2026-01-25T05:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.067214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.067241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.067254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.067268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.067277 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.169296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.169358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.169368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.169380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.169389 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.270719 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.270752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.270761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.270771 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.270781 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.326363 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 13:11:44.321346828 +0000 UTC Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.328637 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.328637 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.328678 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:27 crc kubenswrapper[4728]: E0125 05:39:27.328767 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:27 crc kubenswrapper[4728]: E0125 05:39:27.328800 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:27 crc kubenswrapper[4728]: E0125 05:39:27.328838 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.372135 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.372166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.372176 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.372186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.372195 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.474221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.474263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.474277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.474293 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.474306 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.575878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.575905 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.575914 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.575925 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.575934 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.677535 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.677577 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.677589 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.677603 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.677613 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.778580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.778605 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.778614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.778622 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.778630 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.880130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.880159 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.880168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.880179 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.880186 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.982474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.982508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.982521 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.982535 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:27 crc kubenswrapper[4728]: I0125 05:39:27.982546 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:27Z","lastTransitionTime":"2026-01-25T05:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.084550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.084587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.084596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.084610 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.084619 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.186518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.186550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.186558 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.186568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.186576 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.287802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.287844 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.287855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.287871 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.287881 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.327369 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 23:05:17.604399993 +0000 UTC Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.328536 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:28 crc kubenswrapper[4728]: E0125 05:39:28.328629 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.389378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.389415 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.389425 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.389435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.389443 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.491361 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.491393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.491403 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.491416 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.491425 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.592858 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.592882 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.592890 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.592900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.592908 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.694894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.694931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.694940 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.694953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.694963 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.796165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.796194 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.796202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.796211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.796218 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.897874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.897896 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.897904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.897914 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:28 crc kubenswrapper[4728]: I0125 05:39:28.897921 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:28.999893 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:28.999922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:28.999933 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:28.999946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:28.999955 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:28Z","lastTransitionTime":"2026-01-25T05:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.101570 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.101594 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.101601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.101612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.101619 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.202976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.203010 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.203019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.203029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.203037 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.305136 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.305162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.305170 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.305180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.305188 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.327831 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 15:51:15.85829725 +0000 UTC Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.327909 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.327934 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.328010 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:29 crc kubenswrapper[4728]: E0125 05:39:29.328109 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:29 crc kubenswrapper[4728]: E0125 05:39:29.328184 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:29 crc kubenswrapper[4728]: E0125 05:39:29.328259 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.340515 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.348766 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.361021 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.374435 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.382659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.389982 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.399461 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.406299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.406341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.406351 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.406364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.406373 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.408050 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.416980 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.426061 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.433213 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.439841 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc 
kubenswrapper[4728]: I0125 05:39:29.449048 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d96
36a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.456502 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.464383 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.471472 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.477518 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.485793 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:29Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.508196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.508227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.508236 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.508250 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.508259 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.610024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.610063 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.610070 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.610083 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.610095 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.712083 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.712111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.712119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.712132 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.712140 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.814240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.814272 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.814281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.814293 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.814301 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.915986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.916020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.916029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.916040 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:29 crc kubenswrapper[4728]: I0125 05:39:29.916049 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:29Z","lastTransitionTime":"2026-01-25T05:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.017926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.017958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.017967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.017980 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.017988 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.119410 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.119452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.119460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.119477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.119488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.221421 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.221450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.221460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.221472 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.221479 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.323675 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.323713 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.323723 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.323747 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.323757 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.327915 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 01:15:46.751227705 +0000 UTC Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.328022 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.328119 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.425419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.425458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.425467 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.425482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.425492 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.475139 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.475259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.475343 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.475422 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.475487 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.486344 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:30Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.488855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.488897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.488908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.488922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.488931 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.496707 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:30Z is after 2025-08-24T17:21:41Z"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.498962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.498992 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.499001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.499011 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.499018 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.507172 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:30Z is after 2025-08-24T17:21:41Z"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.509305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.509359 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.509369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.509380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.509388 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.517293 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:30Z is after 2025-08-24T17:21:41Z"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.519291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.519342 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.519353 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.519362 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.519369 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.527111 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:30Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:30 crc kubenswrapper[4728]: E0125 05:39:30.527211 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.528196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.528241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.528253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.528263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.528272 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.630051 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.630080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.630090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.630100 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.630126 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.731260 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.731289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.731298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.731308 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.731315 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.832524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.832553 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.832561 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.832571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.832580 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.934528 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.934559 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.934567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.934581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:30 crc kubenswrapper[4728]: I0125 05:39:30.934590 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:30Z","lastTransitionTime":"2026-01-25T05:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.036483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.036523 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.036533 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.036552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.036564 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.139216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.139244 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.139254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.139263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.139285 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.240436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.240464 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.240472 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.240481 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.240488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.327945 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:31 crc kubenswrapper[4728]: E0125 05:39:31.328469 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.327987 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:16:29.217872533 +0000 UTC Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.328014 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:31 crc kubenswrapper[4728]: E0125 05:39:31.328707 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.327960 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:31 crc kubenswrapper[4728]: E0125 05:39:31.328893 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.341772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.341804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.341814 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.341829 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.341839 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.443935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.443996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.444009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.444026 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.444035 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.545945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.546253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.546339 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.546421 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.546481 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.648290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.648333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.648344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.648356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.648363 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.750313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.750388 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.750405 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.750423 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.750436 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.851725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.851827 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.851902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.851969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.852034 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.953226 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.953252 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.953261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.953274 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:31 crc kubenswrapper[4728]: I0125 05:39:31.953281 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:31Z","lastTransitionTime":"2026-01-25T05:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.055158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.055196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.055210 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.055229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.055240 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.156658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.156686 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.156694 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.156706 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.156716 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.257907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.257939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.257948 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.257961 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.257969 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.328674 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:22:08.659895252 +0000 UTC Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.328702 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.329205 4728 scope.go:117] "RemoveContainer" containerID="531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7" Jan 25 05:39:32 crc kubenswrapper[4728]: E0125 05:39:32.329295 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:32 crc kubenswrapper[4728]: E0125 05:39:32.329414 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.359912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.359941 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.359950 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.359963 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.359971 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.462404 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.462442 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.462451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.462464 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.462473 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.564339 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.564372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.564381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.564395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.564405 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.666784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.666911 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.666971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.667031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.667099 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.768815 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.768857 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.768867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.768886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.768895 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.870630 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.870666 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.870675 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.870688 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.870697 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.972518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.972618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.972693 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.972763 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:32 crc kubenswrapper[4728]: I0125 05:39:32.972829 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:32Z","lastTransitionTime":"2026-01-25T05:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.075378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.075488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.075554 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.075624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.075677 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.177579 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.177634 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.177648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.177670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.177688 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.279677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.279720 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.279741 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.279760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.279771 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.328560 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.328926 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.328935 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:33 crc kubenswrapper[4728]: E0125 05:39:33.329034 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.329148 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:27:59.637932621 +0000 UTC Jan 25 05:39:33 crc kubenswrapper[4728]: E0125 05:39:33.329659 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:33 crc kubenswrapper[4728]: E0125 05:39:33.329660 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.381922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.381956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.381969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.381984 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.381995 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.483921 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.483965 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.483976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.483987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.483996 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.585483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.585562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.585577 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.585593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.585603 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.687257 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.687303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.687331 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.687347 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.687360 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.789840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.789876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.789885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.789900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.789913 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.891470 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.891500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.891509 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.891519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.891531 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.993981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.994027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.994037 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.994051 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:33 crc kubenswrapper[4728]: I0125 05:39:33.994060 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:33Z","lastTransitionTime":"2026-01-25T05:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.095675 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.095710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.095720 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.095744 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.095754 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.197593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.197627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.197636 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.197647 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.197655 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.299399 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.299432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.299441 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.299450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.299459 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.328771 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:34 crc kubenswrapper[4728]: E0125 05:39:34.328858 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.330069 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 20:37:08.921589776 +0000 UTC Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.400578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.400605 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.400614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.400642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.400654 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.502315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.502358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.502368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.502379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.502388 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.603957 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.603987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.603997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.604010 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.604018 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.705500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.705546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.705556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.705566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.705575 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.807608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.807715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.807796 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.807864 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.807925 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.909267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.909294 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.909303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.909337 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:34 crc kubenswrapper[4728]: I0125 05:39:34.909347 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:34Z","lastTransitionTime":"2026-01-25T05:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.010761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.010794 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.010803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.010816 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.010824 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.112542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.112573 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.112585 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.112598 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.112608 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.217010 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.217044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.217054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.217137 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.217178 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.319865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.319899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.319910 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.319923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.319934 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.328340 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.328352 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.328338 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:35 crc kubenswrapper[4728]: E0125 05:39:35.328446 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:35 crc kubenswrapper[4728]: E0125 05:39:35.328532 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:35 crc kubenswrapper[4728]: E0125 05:39:35.328626 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.330509 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 14:14:20.775141994 +0000 UTC Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.422177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.422206 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.422215 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.422253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.422278 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.524258 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.524293 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.524302 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.524316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.524354 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.625756 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.625787 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.625796 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.625805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.625813 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.727864 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.727888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.727897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.727908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.727915 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.829371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.829398 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.829408 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.829418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.829426 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.931754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.931802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.931811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.931823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:35 crc kubenswrapper[4728]: I0125 05:39:35.931833 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:35Z","lastTransitionTime":"2026-01-25T05:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.033605 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.033632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.033640 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.033651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.033659 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.135312 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.135356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.135364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.135373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.135382 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.236835 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.236864 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.236875 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.236885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.236892 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.311730 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:36 crc kubenswrapper[4728]: E0125 05:39:36.311876 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:36 crc kubenswrapper[4728]: E0125 05:39:36.311937 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:40:08.311922464 +0000 UTC m=+99.347800444 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.327762 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:36 crc kubenswrapper[4728]: E0125 05:39:36.327880 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.330816 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:55:10.997652532 +0000 UTC Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.338971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.338999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.339009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.339022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.339033 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.441088 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.441117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.441127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.441139 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.441149 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.542873 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.542904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.542912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.542925 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.542933 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.644760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.644787 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.644795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.644810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.644818 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.747016 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.747046 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.747055 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.747066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.747076 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.848303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.848348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.848356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.848367 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.848377 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.949972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.950006 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.950014 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.950029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:36 crc kubenswrapper[4728]: I0125 05:39:36.950038 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:36Z","lastTransitionTime":"2026-01-25T05:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.051670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.051693 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.051702 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.051713 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.051733 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.153447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.153474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.153482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.153492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.153500 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.255340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.255371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.255379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.255390 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.255399 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.328009 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:37 crc kubenswrapper[4728]: E0125 05:39:37.328093 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.328199 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:37 crc kubenswrapper[4728]: E0125 05:39:37.328345 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.328207 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:37 crc kubenswrapper[4728]: E0125 05:39:37.328525 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.330879 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:15:45.22574036 +0000 UTC Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.356842 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.356937 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.356993 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.357061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.357126 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.458727 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.458765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.458790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.458806 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.458816 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.560285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.560306 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.560335 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.560347 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.560355 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.608792 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/0.log" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.609121 4728 generic.go:334] "Generic (PLEG): container finished" podID="c2ffc038-3d70-4d2c-b150-e8529f622238" containerID="9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5" exitCode=1 Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.609160 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerDied","Data":"9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.609463 4728 scope.go:117] "RemoveContainer" containerID="9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.620290 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.629088 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.637240 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.644659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.650884 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.659392 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.662165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.662197 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.662207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.662238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.662247 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.667504 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e8
0e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.679812 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.692512 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.705213 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.713640 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.720710 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.729499 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.738158 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.746496 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.756066 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.763504 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.763973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.764003 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.764013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.764026 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.764035 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.770809 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:37Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:37 crc 
kubenswrapper[4728]: I0125 05:39:37.866375 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.866422 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.866432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.866446 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.866473 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.968805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.968847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.968857 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.968874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:37 crc kubenswrapper[4728]: I0125 05:39:37.968885 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:37Z","lastTransitionTime":"2026-01-25T05:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.072031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.072075 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.072090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.072112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.072182 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.173752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.173784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.173795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.173811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.173822 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.276345 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.276374 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.276386 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.276398 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.276408 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.327996 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:38 crc kubenswrapper[4728]: E0125 05:39:38.328124 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.331216 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:53:30.768940334 +0000 UTC Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.378013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.378136 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.378207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.378269 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.378354 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.479860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.479885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.479893 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.479902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.479910 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.581637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.581656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.581664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.581678 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.581685 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.613781 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/0.log" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.613832 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerStarted","Data":"18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.623421 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.635084 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.644344 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.652129 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc 
kubenswrapper[4728]: I0125 05:39:38.661144 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.684213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.684400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.684415 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.684431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.684439 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.688613 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.705961 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.721540 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.731439 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.742148 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc99
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.757220 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.766944 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.781374 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z"
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.789127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.789155 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.789167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.789180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.789189 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.792737 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.803822 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.812947 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z"
Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.820458 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.839940 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05
:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:38Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.891574 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.891600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.891610 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.891620 4728 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.891629 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.994044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.994097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.994109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.994126 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:38 crc kubenswrapper[4728]: I0125 05:39:38.994136 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:38Z","lastTransitionTime":"2026-01-25T05:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.095974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.096028 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.096042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.096065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.096080 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.198111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.198148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.198161 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.198173 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.198182 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.300331 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.300376 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.300418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.300430 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.300440 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.327995 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.328031 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:39 crc kubenswrapper[4728]: E0125 05:39:39.328096 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:39 crc kubenswrapper[4728]: E0125 05:39:39.328176 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.328041 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:39 crc kubenswrapper[4728]: E0125 05:39:39.328263 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.331288 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:49:35.662887676 +0000 UTC Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.343975 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.353791 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.362181 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.369743 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.376738 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.390174 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05
:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.402335 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.402368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.402379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.402396 4728 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.402407 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.404158 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.414051 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.422450 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.429453 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc 
kubenswrapper[4728]: I0125 05:39:39.438507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.446443 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.454434 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.461841 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.470220 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.478746 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc99
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.487915 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.496361 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:39Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.504687 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.504712 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.504732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.504750 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.504763 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.610560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.610610 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.610619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.610635 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.610647 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.713475 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.713499 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.713507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.713519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.713530 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.816075 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.816187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.816253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.816345 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.816416 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.917729 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.917754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.917765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.917777 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:39 crc kubenswrapper[4728]: I0125 05:39:39.917786 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:39Z","lastTransitionTime":"2026-01-25T05:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.020922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.020951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.020961 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.020974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.020983 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.123410 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.123435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.123446 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.123460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.123489 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.224863 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.224891 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.224923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.224938 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.224949 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.327413 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.327529 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.327593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.327654 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.327708 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.327770 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.327981 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.331665 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 19:25:35.578512405 +0000 UTC Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.430151 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.430243 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.430316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.430399 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.430460 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.531926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.531956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.531966 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.532024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.532033 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.633682 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.633725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.633736 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.633750 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.633761 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.735316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.735356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.735365 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.735375 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.735401 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.743263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.743285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.743295 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.743307 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.743315 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.752186 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:40Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.754764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.754803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.754812 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.754823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.754832 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.762602 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:40Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.764958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.764979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.764987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.764997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.765006 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.773567 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:40Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.776335 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.776358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.776369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.776381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.776389 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.785059 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:40Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.787486 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.787514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.787525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.787536 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.787550 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.796152 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:40Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:40 crc kubenswrapper[4728]: E0125 05:39:40.796278 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.837080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.837110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.837119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.837130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.837139 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.938970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.939014 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.939023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.939033 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:40 crc kubenswrapper[4728]: I0125 05:39:40.939040 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:40Z","lastTransitionTime":"2026-01-25T05:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.040600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.040633 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.040644 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.040655 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.040662 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.142444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.142479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.142489 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.142503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.142512 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.244016 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.244042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.244052 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.244065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.244073 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.329769 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:41 crc kubenswrapper[4728]: E0125 05:39:41.329922 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.330179 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:41 crc kubenswrapper[4728]: E0125 05:39:41.330310 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.330480 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:41 crc kubenswrapper[4728]: E0125 05:39:41.330556 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.332386 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 05:21:49.764942513 +0000 UTC Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.345296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.345332 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.345343 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.345354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.345362 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.447015 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.447048 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.447057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.447072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.447100 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.548705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.548758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.548770 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.548780 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.548789 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.650979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.651013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.651022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.651036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.651046 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.752805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.752841 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.752851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.752865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.752874 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.854759 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.854788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.854798 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.854809 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.854825 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.956564 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.956591 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.956601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.956614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:41 crc kubenswrapper[4728]: I0125 05:39:41.956626 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:41Z","lastTransitionTime":"2026-01-25T05:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.058437 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.058472 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.058482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.058496 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.058506 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.160230 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.160266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.160275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.160289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.160298 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.262450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.262486 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.262496 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.262511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.262520 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.328677 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:42 crc kubenswrapper[4728]: E0125 05:39:42.328789 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.333030 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 11:53:10.918672795 +0000 UTC Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.364664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.364718 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.364752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.364765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.364774 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.466462 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.466492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.466503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.466514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.466522 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.568182 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.568210 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.568219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.568229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.568237 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.670134 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.670163 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.670172 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.670183 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.670191 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.772279 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.772359 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.772369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.772382 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.772390 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.874514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.874542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.874552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.874562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.874569 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.976315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.976358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.976367 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.976379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:42 crc kubenswrapper[4728]: I0125 05:39:42.976390 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:42Z","lastTransitionTime":"2026-01-25T05:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.078281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.078309 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.078332 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.078344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.078353 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.179497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.179526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.179535 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.179546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.179554 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.281697 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.281732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.281745 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.281754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.281765 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.328590 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:43 crc kubenswrapper[4728]: E0125 05:39:43.328748 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.328785 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.328819 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:43 crc kubenswrapper[4728]: E0125 05:39:43.328877 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:43 crc kubenswrapper[4728]: E0125 05:39:43.328975 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.333422 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 23:19:45.449712811 +0000 UTC Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.340284 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.383022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.383049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.383058 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.383072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.383081 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.484597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.484624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.484633 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.484645 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.484658 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.586044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.586081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.586089 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.586103 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.586112 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.687710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.687754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.687763 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.687772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.687779 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.789918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.789949 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.789957 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.789968 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.789980 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.894036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.894147 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.894221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.894288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.894383 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.995977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.996016 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.996024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.996037 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:43 crc kubenswrapper[4728]: I0125 05:39:43.996046 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:43Z","lastTransitionTime":"2026-01-25T05:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.098147 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.098186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.098201 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.098215 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.098226 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.200312 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.200349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.200357 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.200368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.200375 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.302295 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.302339 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.302348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.302359 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.302369 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.327938 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:44 crc kubenswrapper[4728]: E0125 05:39:44.328038 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.334420 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 04:49:54.505384389 +0000 UTC Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.403765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.403795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.403804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.403815 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.403822 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.505866 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.505897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.505906 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.505917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.505928 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.607255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.607282 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.607290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.607300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.607313 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.709123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.709154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.709166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.709178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.709187 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.810670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.810696 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.810705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.810715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.810722 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.912561 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.912590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.912604 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.912614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:44 crc kubenswrapper[4728]: I0125 05:39:44.912623 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:44Z","lastTransitionTime":"2026-01-25T05:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.014140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.014165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.014173 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.014184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.014191 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.115861 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.115883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.115890 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.115900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.115907 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.217776 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.217805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.217814 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.217823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.217830 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.319382 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.319411 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.319419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.319427 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.319436 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.328491 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.328516 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:45 crc kubenswrapper[4728]: E0125 05:39:45.328595 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.328744 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:45 crc kubenswrapper[4728]: E0125 05:39:45.328843 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:45 crc kubenswrapper[4728]: E0125 05:39:45.328957 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.335241 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 14:05:54.117996082 +0000 UTC Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.420883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.420909 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.420919 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.420929 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.420937 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.522854 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.522884 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.522892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.522903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.522912 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.624187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.624219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.624228 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.624238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.624246 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.726084 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.726130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.726140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.726154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.726166 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.828146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.828186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.828198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.828213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.828223 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.930371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.930403 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.930411 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.930421 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:45 crc kubenswrapper[4728]: I0125 05:39:45.930430 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:45Z","lastTransitionTime":"2026-01-25T05:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.032793 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.032823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.032834 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.032844 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.032851 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.134630 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.134688 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.134699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.134719 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.134742 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.236033 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.236069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.236083 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.236097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.236112 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.328313 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:46 crc kubenswrapper[4728]: E0125 05:39:46.328438 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.328872 4728 scope.go:117] "RemoveContainer" containerID="531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.336195 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 01:09:35.325225858 +0000 UTC Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.341615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.341669 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.341685 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.341705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.341726 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.443242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.443276 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.443286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.443299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.443309 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.544770 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.544832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.544842 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.544862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.544873 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.634873 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/2.log" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.637702 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.638118 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.646570 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.646597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.646605 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.646617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.646627 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.649102 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41738147-157e-4427-87e1-8c96482b330c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.669174 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.678478 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.685486 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.695878 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.704659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.712443 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.721598 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.728723 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.736117 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc 
kubenswrapper[4728]: I0125 05:39:46.745283 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d96
36a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.748349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.748377 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.748388 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.748404 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.748414 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.756708 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.765273 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.774094 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.781600 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.789749 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc99
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.798226 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.806473 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.819412 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:46Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.850175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.850206 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.850215 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.850229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.850237 4728 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.952154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.952191 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.952201 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.952217 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:46 crc kubenswrapper[4728]: I0125 05:39:46.952228 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:46Z","lastTransitionTime":"2026-01-25T05:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.054148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.054188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.054198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.054213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.054223 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.155755 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.155783 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.155791 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.155803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.155812 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.257832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.257992 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.258065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.258124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.258176 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.327971 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.328005 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:47 crc kubenswrapper[4728]: E0125 05:39:47.328254 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.328042 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:47 crc kubenswrapper[4728]: E0125 05:39:47.328132 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:47 crc kubenswrapper[4728]: E0125 05:39:47.328387 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.337006 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 06:17:05.285361018 +0000 UTC Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.359824 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.359847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.359856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.359870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.359879 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.461591 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.461683 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.461771 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.461847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.461906 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.564148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.564170 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.564179 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.564189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.564197 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.641239 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/3.log" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.641863 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/2.log" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.644269 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" exitCode=1 Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.644302 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.644353 4728 scope.go:117] "RemoveContainer" containerID="531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.644827 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:39:47 crc kubenswrapper[4728]: E0125 05:39:47.644973 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.655289 4728 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.663789 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.665927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.665962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.665973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.665990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.666001 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.672920 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.681761 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.689969 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.697935 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.706390 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.713898 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.726808 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://531b4b4b10228d36eb1d7063f29d3822bb3d31743d32a2fc6e0db4ef7c2e25a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:18Z\\\",\\\"message\\\":\\\"y object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926878 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0125 05:39:18.926884 6385 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0125 05:39:18.926886 6385 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\\nI0125 05:39:18.926889 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0125 05:39:18.926891 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI0125 05:39:18.926894 6385 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 in node crc\\\\nI0125 05:39:18.926900 6385 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9 after 0 failed attempt(s)\\\\nI0125 05:39:18.926901 6385 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0125 05:39:18.926905 6385 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:47Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000849 6814 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node 
crc\\\\nI0125 05:39:47.000853 6814 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0125 05:39:47.000856 6814 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000526 6814 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0125 05:39:47.000551 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0125 05:39:47.000661 6814 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0125 05:39:46.999920 6814 services_controller.go:445] Built service openshift-kube-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:47.000920 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b40
77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.733239 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.739475 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41738147-157e-4427-87e1-8c96482b330c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc 
kubenswrapper[4728]: I0125 05:39:47.751589 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.758447 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.767690 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.767912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.767936 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.767964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.767978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.767988 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.774528 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.781049 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc 
kubenswrapper[4728]: I0125 05:39:47.789922 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.797497 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.805920 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:47Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.869401 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.869431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.869462 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc 
kubenswrapper[4728]: I0125 05:39:47.869474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.869484 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.970979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.971009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.971018 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.971047 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:47 crc kubenswrapper[4728]: I0125 05:39:47.971057 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:47Z","lastTransitionTime":"2026-01-25T05:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.072855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.072910 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.072920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.072934 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.072945 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.174457 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.174480 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.174488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.174500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.174509 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.276261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.276290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.276301 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.276315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.276341 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.328512 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:48 crc kubenswrapper[4728]: E0125 05:39:48.328659 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.337730 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:22:20.769950104 +0000 UTC Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.378396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.378422 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.378431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.378443 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.378454 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.480732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.480789 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.480801 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.480816 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.480826 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.582524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.582558 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.582567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.582580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.582590 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.648555 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/3.log" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.651372 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:39:48 crc kubenswrapper[4728]: E0125 05:39:48.651516 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.662279 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.671754 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.678567 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.684570 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.684607 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.684617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.684630 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.684638 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.689696 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.699158 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.707496 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.719957 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:47Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000849 6814 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0125 05:39:47.000853 6814 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0125 05:39:47.000856 6814 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000526 6814 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0125 05:39:47.000551 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0125 05:39:47.000661 6814 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0125 05:39:46.999920 6814 services_controller.go:445] Built service openshift-kube-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:47.000920 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.728684 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.738339 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.745845 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.752040 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.758327 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41738147-157e-4427-87e1-8c96482b330c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc 
kubenswrapper[4728]: I0125 05:39:48.776870 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.786264 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.786915 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.786946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.786958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc 
kubenswrapper[4728]: I0125 05:39:48.786972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.786979 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.796146 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56
303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.803080 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9
cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.809507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc 
kubenswrapper[4728]: I0125 05:39:48.817788 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.826013 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:48Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.888668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.888706 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.888746 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.888762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.888771 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.990539 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.990575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.990583 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.990596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:48 crc kubenswrapper[4728]: I0125 05:39:48.990604 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:48Z","lastTransitionTime":"2026-01-25T05:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.092457 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.092498 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.092507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.092519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.092528 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.194804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.194848 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.194858 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.194878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.194889 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.296988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.297030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.297039 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.297054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.297063 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.328666 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:49 crc kubenswrapper[4728]: E0125 05:39:49.328773 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.328818 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.328826 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:49 crc kubenswrapper[4728]: E0125 05:39:49.328883 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:49 crc kubenswrapper[4728]: E0125 05:39:49.328912 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.338086 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 07:47:09.388242631 +0000 UTC Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.340216 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bb
b879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.349689 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.357759 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.365000 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.371891 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.379423 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc99
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.387371 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.397875 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.398500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.398529 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.398537 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.398548 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.398559 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.413151 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:47Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000849 6814 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0125 05:39:47.000853 6814 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0125 05:39:47.000856 6814 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000526 6814 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0125 05:39:47.000551 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0125 05:39:47.000661 6814 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0125 05:39:46.999920 6814 services_controller.go:445] Built service openshift-kube-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:47.000920 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.419419 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41738147-157e-4427-87e1-8c96482b330c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.431369 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.438582 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.444854 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.452466 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.459881 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.467100 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.477225 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.487459 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.494394 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:49Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:49 crc 
kubenswrapper[4728]: I0125 05:39:49.500616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.500638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.500646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.500658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.500667 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.602231 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.602253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.602261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.602273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.602297 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.703582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.703609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.703617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.703643 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.703652 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.804599 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.804631 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.804640 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.804668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.804677 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.906296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.906341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.906352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.906365 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:49 crc kubenswrapper[4728]: I0125 05:39:49.906373 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:49Z","lastTransitionTime":"2026-01-25T05:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.007973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.008012 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.008022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.008035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.008045 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.109770 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.109808 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.109817 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.109853 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.109861 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.216945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.216970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.216979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.216995 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.217006 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.318803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.318831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.318842 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.318852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.318860 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.328234 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:50 crc kubenswrapper[4728]: E0125 05:39:50.328346 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.338543 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:18:38.259113591 +0000 UTC Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.419969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.419994 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.420004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.420013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.420021 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.521830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.521854 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.521861 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.521869 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.521875 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.623608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.623651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.623661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.623671 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.623678 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.725376 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.725405 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.725414 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.725424 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.725433 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.827392 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.827416 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.827425 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.827434 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.827441 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.929056 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.929077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.929085 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.929094 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:50 crc kubenswrapper[4728]: I0125 05:39:50.929101 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:50Z","lastTransitionTime":"2026-01-25T05:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.030100 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.030124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.030132 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.030141 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.030148 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.131523 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.131547 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.131556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.131567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.131574 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.184774 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.185035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.185057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.185070 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.185078 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.194222 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.196619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.196640 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.196649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.196658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.196665 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.204832 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.206766 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.206797 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.206805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.206815 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.206821 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.214437 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.216452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.216475 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.216483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.216492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.216499 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.224826 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.224889 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.224908 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.224937 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.224955 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.224883 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry
.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\
\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"na
mes\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3e
a98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225045 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225058 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225068 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225073 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225097 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:55.225087437 +0000 UTC m=+146.260965417 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225108 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:55.225103848 +0000 UTC m=+146.260981827 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225131 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:55.225122272 +0000 UTC m=+146.261000252 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225142 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225150 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225157 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225162 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225172 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:55.225167658 +0000 UTC m=+146.261045639 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.225182 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:55.225176385 +0000 UTC m=+146.261054364 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.227107 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.227131 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.227138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.227146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.227153 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.234839 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:51Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.234944 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.235737 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.235768 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.235777 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.235788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.235796 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.328610 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.328651 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.329660 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.329685 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.329771 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:51 crc kubenswrapper[4728]: E0125 05:39:51.329880 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.336856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.336883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.336907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.336920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.336929 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.339013 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 01:54:42.591861138 +0000 UTC Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.438637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.438657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.438664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.438674 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.438681 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.540256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.540286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.540295 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.540353 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.540366 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.644874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.644964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.644975 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.645023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.645039 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.750313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.750754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.750767 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.750782 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.750792 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.853638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.853670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.853683 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.853699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.853710 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.955513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.955550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.955560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.955575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:51 crc kubenswrapper[4728]: I0125 05:39:51.955585 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:51Z","lastTransitionTime":"2026-01-25T05:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.057216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.057257 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.057267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.057285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.057298 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.159382 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.159417 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.159432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.159444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.159453 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.261566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.261600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.261608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.261628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.261637 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.328199 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:52 crc kubenswrapper[4728]: E0125 05:39:52.328308 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.339719 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 01:57:00.557313612 +0000 UTC Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.362742 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.362785 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.362794 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.362806 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.362815 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.464355 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.464400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.464409 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.464418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.464429 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.567177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.567210 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.567222 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.567236 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.567245 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.669198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.669245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.669255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.669267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.669277 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.770887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.770922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.770933 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.770944 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.770953 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.873027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.873067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.873078 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.873094 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.873106 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.975078 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.975101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.975109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.975119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:52 crc kubenswrapper[4728]: I0125 05:39:52.975127 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:52Z","lastTransitionTime":"2026-01-25T05:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.077001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.077037 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.077047 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.077061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.077069 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.179575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.179625 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.179634 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.179652 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.179663 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.281833 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.281872 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.281882 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.281894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.282090 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.328739 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.328799 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.328739 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:53 crc kubenswrapper[4728]: E0125 05:39:53.328859 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:53 crc kubenswrapper[4728]: E0125 05:39:53.328929 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:53 crc kubenswrapper[4728]: E0125 05:39:53.328986 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.340335 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:54:55.173996986 +0000 UTC Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.384368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.384419 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.384430 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.384447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.384462 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.485996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.486036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.486045 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.486056 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.486066 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.589414 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.589446 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.589455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.589475 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.589488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.691111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.691166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.691177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.691192 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.691212 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.793056 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.793084 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.793092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.793102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.793109 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.894739 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.894783 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.894793 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.894804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.894812 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.997157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.997188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.997199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.997216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:53 crc kubenswrapper[4728]: I0125 05:39:53.997226 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:53Z","lastTransitionTime":"2026-01-25T05:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.099818 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.099851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.099862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.099876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.099886 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.201835 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.201887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.201900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.201922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.201938 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.304284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.304342 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.304354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.304366 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.304375 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.328796 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:54 crc kubenswrapper[4728]: E0125 05:39:54.328921 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.341169 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 09:44:31.782409389 +0000 UTC Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.406526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.406553 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.406565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.406578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.406590 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.508154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.508178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.508187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.508198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.508206 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.610067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.610106 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.610114 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.610125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.610134 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.711428 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.711452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.711459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.711469 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.711477 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.812868 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.812892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.812902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.812912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.812918 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.914408 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.914440 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.914468 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.914481 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:54 crc kubenswrapper[4728]: I0125 05:39:54.914489 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:54Z","lastTransitionTime":"2026-01-25T05:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.016991 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.017025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.017035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.017047 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.017055 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.119138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.119168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.119178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.119192 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.119199 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.220888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.220940 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.220952 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.220970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.220980 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.323305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.323348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.323358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.323370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.323378 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.328828 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.328871 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:55 crc kubenswrapper[4728]: E0125 05:39:55.328915 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.328835 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:55 crc kubenswrapper[4728]: E0125 05:39:55.329022 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:55 crc kubenswrapper[4728]: E0125 05:39:55.329101 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.341863 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 14:56:41.760592709 +0000 UTC Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.424575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.424618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.424627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.424637 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.424645 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.526019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.526057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.526068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.526082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.526092 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.628229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.628430 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.628509 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.628569 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.628617 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.730443 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.730581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.730668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.730729 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.730798 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.832690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.832803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.832878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.832957 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.833008 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.934601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.934725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.934793 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.934853 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:55 crc kubenswrapper[4728]: I0125 05:39:55.934911 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:55Z","lastTransitionTime":"2026-01-25T05:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.036546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.036656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.036714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.036780 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.036846 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.138686 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.138712 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.138719 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.138728 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.138735 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.240843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.240879 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.240887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.240896 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.240904 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.328917 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:56 crc kubenswrapper[4728]: E0125 05:39:56.329147 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.342080 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 17:28:14.74026836 +0000 UTC Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.342677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.342746 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.342773 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.342797 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.342812 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.445372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.445412 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.445420 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.445431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.445444 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.547001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.547032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.547041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.547052 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.547064 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.649032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.649060 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.649070 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.649082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.649091 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.750478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.750588 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.750664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.750730 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.750810 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.852214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.852271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.852291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.852303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.852311 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.954411 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.954478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.954490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.954516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:56 crc kubenswrapper[4728]: I0125 05:39:56.954562 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:56Z","lastTransitionTime":"2026-01-25T05:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.056266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.056291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.056299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.056310 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.056339 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.158581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.158607 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.158616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.158626 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.158633 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.259935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.259979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.259991 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.260007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.260022 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.329547 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.329640 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:57 crc kubenswrapper[4728]: E0125 05:39:57.329726 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.329780 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:57 crc kubenswrapper[4728]: E0125 05:39:57.329910 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:57 crc kubenswrapper[4728]: E0125 05:39:57.329992 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.342889 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 06:02:42.076461105 +0000 UTC Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.362050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.362072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.362080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.362090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.362097 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.463899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.463927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.463935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.463944 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.463952 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.565860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.565886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.565894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.565903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.565910 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.667358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.667390 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.667418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.667429 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.667438 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.768266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.768288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.768296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.768304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.768310 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.870025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.870049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.870058 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.870067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.870075 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.971624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.971648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.971656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.971665 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:57 crc kubenswrapper[4728]: I0125 05:39:57.971673 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:57Z","lastTransitionTime":"2026-01-25T05:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.073590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.073611 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.073619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.073627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.073634 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.175514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.175534 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.175542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.175551 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.175558 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.277713 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.277739 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.277747 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.277773 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.277780 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.328821 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:39:58 crc kubenswrapper[4728]: E0125 05:39:58.328915 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.343113 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 23:29:02.590065433 +0000 UTC Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.379486 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.379518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.379526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.379536 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.379542 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.480584 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.480615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.480625 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.480638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.480647 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.582086 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.582121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.582130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.582144 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.582155 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.683396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.683418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.683426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.683435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.683442 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.785487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.785517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.785526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.785687 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.785695 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.887947 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.887974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.887982 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.887992 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.888014 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.989895 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.989942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.989951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.989971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:58 crc kubenswrapper[4728]: I0125 05:39:58.989982 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:58Z","lastTransitionTime":"2026-01-25T05:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.091273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.091305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.091314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.091348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.091356 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.192873 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.192902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.192912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.192923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.192931 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.294789 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.294831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.294846 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.294862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.294872 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.328257 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.328290 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.328336 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:39:59 crc kubenswrapper[4728]: E0125 05:39:59.328459 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:39:59 crc kubenswrapper[4728]: E0125 05:39:59.328571 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:39:59 crc kubenswrapper[4728]: E0125 05:39:59.328665 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.339622 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.344017 4728 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:05:56.572843747 +0000 UTC Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.346628 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc 
kubenswrapper[4728]: I0125 05:39:59.358300 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.366199 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.373868 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.382953 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.389336 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.396068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.396104 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.396118 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.396136 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.396147 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.403012 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.411910 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.419756 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.426794 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.433562 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.441046 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.448265 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.460673 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:47Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000849 6814 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0125 05:39:47.000853 6814 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0125 05:39:47.000856 6814 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000526 6814 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0125 05:39:47.000551 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0125 05:39:47.000661 6814 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0125 05:39:46.999920 6814 services_controller.go:445] Built service openshift-kube-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:47.000920 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.470867 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41738147-157e-4427-87e1-8c96482b330c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.483197 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.490426 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.496897 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:39:59Z is after 2025-08-24T17:21:41Z" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.498071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.498103 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.498131 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.498150 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.498164 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.600437 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.600480 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.600490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.600502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.600513 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.702564 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.702597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.702606 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.702621 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.702632 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.804134 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.804163 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.804172 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.804186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.804198 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.906071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.906102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.906111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.906123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:39:59 crc kubenswrapper[4728]: I0125 05:39:59.906130 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:39:59Z","lastTransitionTime":"2026-01-25T05:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.007990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.008023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.008031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.008044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.008054 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.109938 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.109966 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.109976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.109987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.109995 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.211658 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.211691 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.211702 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.211734 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.211744 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.317002 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.317034 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.317044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.317372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.317392 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.328355 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:00 crc kubenswrapper[4728]: E0125 05:40:00.328457 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.344600 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 03:23:40.087411441 +0000 UTC Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.418893 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.418958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.418968 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.418978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.418986 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.520927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.520953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.520962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.520974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.520981 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.622724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.622752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.622761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.622780 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.622787 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.724746 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.724788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.724799 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.724811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.724819 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.826646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.826674 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.826683 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.826694 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.826702 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.928218 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.928251 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.928260 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.928272 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:00 crc kubenswrapper[4728]: I0125 05:40:00.928283 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:00Z","lastTransitionTime":"2026-01-25T05:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.029558 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.029593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.029603 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.029613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.029621 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.131410 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.131438 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.131447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.131459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.131467 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.232680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.232707 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.232715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.232725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.232747 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.329427 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.329459 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.329521 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.329618 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.329668 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.329777 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.334512 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.334538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.334547 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.334557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.334565 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.344951 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:08:29.085454818 +0000 UTC Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.436653 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.436680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.436688 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.436698 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.436707 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.530962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.530988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.530996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.531005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.531011 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.541217 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.544388 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.544413 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.544421 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.544435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.544442 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.554308 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.557028 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.557099 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.557111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.557124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.557134 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.567170 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.569187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.569216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.569225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.569237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.569244 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.578334 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.580410 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.580437 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.580447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.580458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.580467 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.588760 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-25T05:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f0c3d81c-5f2b-4a0c-8bf5-076bd1019cfc\\\",\\\"systemUUID\\\":\\\"3ea98fc6-5f41-42ee-97d9-1061312a21b0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:01Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:01 crc kubenswrapper[4728]: E0125 05:40:01.588868 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.589805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.589825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.589834 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.589843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.589850 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.691529 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.691738 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.692379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.692412 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.692426 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.794363 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.794389 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.794399 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.794408 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.794416 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.895894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.895921 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.895930 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.895941 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.895949 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.997914 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.998108 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.998124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.998138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:01 crc kubenswrapper[4728]: I0125 05:40:01.998148 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:01Z","lastTransitionTime":"2026-01-25T05:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.100472 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.100500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.100510 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.100520 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.100528 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.202264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.202289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.202297 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.202306 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.202314 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.303381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.303426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.303447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.303457 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.303471 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.327915 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:02 crc kubenswrapper[4728]: E0125 05:40:02.328020 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.345509 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 05:27:58.196653331 +0000 UTC Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.404424 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.404443 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.404450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.404459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.404466 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.506078 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.506130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.506140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.506149 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.506157 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.607668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.607689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.607697 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.607706 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.607713 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.709224 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.709254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.709265 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.709276 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.709285 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.810464 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.810506 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.810514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.810526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.810534 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.911939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.911959 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.911967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.911976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:02 crc kubenswrapper[4728]: I0125 05:40:02.911984 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:02Z","lastTransitionTime":"2026-01-25T05:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.013476 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.013504 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.013513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.013524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.013532 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.114958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.114978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.115000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.115009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.115017 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.216145 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.216193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.216203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.216221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.216230 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.318126 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.318152 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.318160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.318170 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.318176 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.328368 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.328386 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.328374 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 25 05:40:03 crc kubenswrapper[4728]: E0125 05:40:03.328472 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 25 05:40:03 crc kubenswrapper[4728]: E0125 05:40:03.328685 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 25 05:40:03 crc kubenswrapper[4728]: E0125 05:40:03.328860 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.328921 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"
Jan 25 05:40:03 crc kubenswrapper[4728]: E0125 05:40:03.329052 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.345835 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 19:43:59.326714582 +0000 UTC
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.419424 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.419442 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.419450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.419459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.419466 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.521077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.521106 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.521115 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.521128 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.521137 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.623143 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.623177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.623186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.623201 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.623211 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.724877 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.724920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.724929 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.724944 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.724954 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.826300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.826337 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.826344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.826353 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.826359 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.927355 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.927383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.927391 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.927426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:03 crc kubenswrapper[4728]: I0125 05:40:03.927443 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:03Z","lastTransitionTime":"2026-01-25T05:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.029036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.029069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.029079 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.029094 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.029101 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.130894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.130932 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.130972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.130986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.130995 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.232192 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.232250 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.232264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.232287 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.232299 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.327992 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4"
Jan 25 05:40:04 crc kubenswrapper[4728]: E0125 05:40:04.328121 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.334348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.334373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.334383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.334397 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.334409 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.346816 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:53:23.912109382 +0000 UTC
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.436240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.436315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.436368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.436388 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.436406 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.537990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.538029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.538040 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.538052 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.538064 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.639817 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.639860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.639871 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.639887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.639901 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.741813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.741840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.741848 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.741860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.741867 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.843101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.843129 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.843141 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.843171 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.843179 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.944854 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.944886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.944895 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.944907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:04 crc kubenswrapper[4728]: I0125 05:40:04.944918 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:04Z","lastTransitionTime":"2026-01-25T05:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.046558 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.046613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.046627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.046648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.046664 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.148082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.148130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.148144 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.148165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.148177 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.249372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.249396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.249404 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.249416 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.249425 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.328825 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.328864 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.328988 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 25 05:40:05 crc kubenswrapper[4728]: E0125 05:40:05.329075 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 25 05:40:05 crc kubenswrapper[4728]: E0125 05:40:05.329113 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 25 05:40:05 crc kubenswrapper[4728]: E0125 05:40:05.329167 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.347720 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 16:41:45.984565959 +0000 UTC
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.350831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.350867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.350877 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.350888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.350896 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.452565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.452599 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.452608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.452622 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.452631 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.554866 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.554897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.554906 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.554917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.554925 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.656666 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.656687 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.656695 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.656707 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.656716 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.758431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.758457 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.758466 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.758479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.758486 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.860212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.860245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.860256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.860268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.860279 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.962459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.962488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.962497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.962507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 25 05:40:05 crc kubenswrapper[4728]: I0125 05:40:05.962518 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:05Z","lastTransitionTime":"2026-01-25T05:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.064412 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.064456 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.064474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.064491 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.064504 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.165980 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.166007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.166016 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.166025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.166037 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.267828 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.267847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.267856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.267866 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.267873 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.328391 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:06 crc kubenswrapper[4728]: E0125 05:40:06.328476 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.348854 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 10:31:45.364756358 +0000 UTC Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.368927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.368964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.368974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.368990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.369003 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.470835 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.470864 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.470873 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.470885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.470893 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.572876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.572903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.572912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.572920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.572926 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.674069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.674099 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.674107 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.674121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.674129 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.776049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.776077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.776086 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.776097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.776104 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.877400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.877439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.877451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.877465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.877476 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.979174 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.979219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.979227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.979241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:06 crc kubenswrapper[4728]: I0125 05:40:06.979251 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:06Z","lastTransitionTime":"2026-01-25T05:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.080774 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.080817 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.080827 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.080838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.080846 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.183218 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.183239 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.183247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.183256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.183262 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.284623 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.284653 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.284663 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.284676 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.284685 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.327983 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.328020 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:07 crc kubenswrapper[4728]: E0125 05:40:07.328096 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:07 crc kubenswrapper[4728]: E0125 05:40:07.328157 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.328168 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:07 crc kubenswrapper[4728]: E0125 05:40:07.328274 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.348969 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:12:57.822887354 +0000 UTC Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.386255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.386288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.386300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.386310 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.386336 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.487969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.487997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.488007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.488017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.488025 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.589693 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.589926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.589994 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.590057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.590191 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.691852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.691920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.691932 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.691945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.691953 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.793867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.793888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.793895 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.793904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.793911 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.895465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.895507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.895518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.895538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.895549 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.997455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.997492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.997501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.997514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:07 crc kubenswrapper[4728]: I0125 05:40:07.997525 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:07Z","lastTransitionTime":"2026-01-25T05:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.099684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.099716 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.099724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.099734 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.099743 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.201277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.201310 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.201337 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.201351 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.201360 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.302473 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.302502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.302512 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.302524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.302532 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.328020 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:08 crc kubenswrapper[4728]: E0125 05:40:08.328188 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.349362 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 17:52:53.686303447 +0000 UTC Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.356645 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:08 crc kubenswrapper[4728]: E0125 05:40:08.356854 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:40:08 crc kubenswrapper[4728]: E0125 05:40:08.356963 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs podName:accc0eb5-6067-4ab9-bbab-6d2ae898942f nodeName:}" failed. No retries permitted until 2026-01-25 05:41:12.356946959 +0000 UTC m=+163.392824929 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs") pod "network-metrics-daemon-k5pj4" (UID: "accc0eb5-6067-4ab9-bbab-6d2ae898942f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.404265 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.404303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.404313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.404344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.404352 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.506142 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.506171 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.506182 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.506195 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.506204 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.607636 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.607660 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.607667 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.607676 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.607683 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.709550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.709576 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.709585 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.709593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.709600 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.811040 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.811076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.811086 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.811095 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.811103 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.912690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.912713 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.912722 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.912733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:08 crc kubenswrapper[4728]: I0125 05:40:08.912742 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:08Z","lastTransitionTime":"2026-01-25T05:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.015025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.015058 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.015092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.015106 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.015115 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.116663 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.116757 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.116825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.116881 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.116938 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.218981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.219009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.219018 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.219027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.219035 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.320672 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.320698 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.320706 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.320715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.320723 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.327890 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.327911 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.327941 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:09 crc kubenswrapper[4728]: E0125 05:40:09.328034 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:09 crc kubenswrapper[4728]: E0125 05:40:09.328219 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:09 crc kubenswrapper[4728]: E0125 05:40:09.328358 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.338067 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.348095 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191ff4fd-0d05-4097-b136-5f443120b4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11e07f91a8c6b61f10ea5864f1266181506aa200b47f81fe26fd7273c70e6290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec98eb02ee13a85ab7406f8dd7797f04eabdfa017266b3f29f08c052114ea91e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fd40465d53c59f05dbce4285df1632f0eaa34673e4e1dcf9e5dc3f1d148558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://942133f4256465324eaf8335045197da8c65b60802269660ed3bbaf217b6883c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de18
29eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0de1829eb6b0ea70024a951150793b83f03d8a0d112a8fbf7caf147fec4008af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0d80aab72f4da70c39b2c99811489b6e3e39ba34382837efedc7ab444ca9fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56303eceaff440406c769be2b7acc9444be9853ab658e292165a6e72507bc071\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mcwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m8nhm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.349500 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:18:23.095501922 +0000 UTC Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.355497 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a01f596a-1896-40e2-b9e8-990c387845a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1278a7425d68bf272833ccbe9dc63ca01ba48908d00379f88dc46cc5ced60f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a14eab95fd3176d4bb0399ae700d0a9a8b9cc84b29376f47445ed3c79bcc99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5tw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jwzv9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.362672 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"accc0eb5-6067-4ab9-bbab-6d2ae898942f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:39:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k5pj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc 
kubenswrapper[4728]: I0125 05:40:09.370611 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2574ace81c31dd6acb796df5ccce9ff5a1292e2dba71025844c2ecdb9688327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.378831 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.388768 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.398206 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4422655c9b7c96154e589248665f204dbdc5393b254ec6da70976723eeef70d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.405672 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vdkq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b49f7c-8776-49b9-9897-6553e57e202b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2423d5feab43ce29770d55112f48205e485f369e7f2546071048a6b3f634c7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8bzdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vdkq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.414031 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdxw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ffc038-3d70-4d2c-b150-e8529f622238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc99
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:37Z\\\",\\\"message\\\":\\\"2026-01-25T05:38:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc\\\\n2026-01-25T05:38:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a081c60d-160f-4f23-ada8-9ecc774a3dfc to /host/opt/cni/bin/\\\\n2026-01-25T05:38:52Z [verbose] multus-daemon started\\\\n2026-01-25T05:38:52Z [verbose] Readiness Indicator file check\\\\n2026-01-25T05:39:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5klfk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdxw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.421969 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.422003 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.422013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.422025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.422036 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.425409 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a4369e-a82b-4d74-afeb-fc4b69b0057a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.433881 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faf39575a03b4164744142bfa645113800a8abdea6a1a9f534e7bf959a8d7d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa4af289de017a1034ed9b2dee5519ec556efe28fda0c541301a40c5375117b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 
05:40:09.446830 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-25T05:39:47Z\\\",\\\"message\\\":\\\"o:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000849 6814 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0125 05:39:47.000853 6814 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0125 05:39:47.000856 6814 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0125 05:39:47.000526 6814 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0125 05:39:47.000551 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0125 05:39:47.000661 6814 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0125 05:39:46.999920 6814 services_controller.go:445] Built service openshift-kube-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0125 05:39:47.000920 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-25T05:39:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca6e327b919ae3e721
bcb22c152290dc34c767578bd5c479d009f95842b4077f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dw4hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zmqrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.455023 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9170842-af41-4dcb-950b-ecef67e8c9a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fed65cc23b3f9a55c1ba82a257b4d40f8ed7008c190772518aec1331db4a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7fb66c50f8d12b5b3ac402244a4fd93db5a6eba11aea606845ec79ec7774e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd16087a145e442cc3c78301dd194b892b45d59c1d62b4f57d9753268030e16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.462857 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33554313-3b32-42fa-9af5-024d4a0e3ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32514b34debf4407f534a688d66af9286be17c82722bbb279f853ec24a5453a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ba4b1e1e82690d84242a207e92e937b8f0c0e560bb3f230af1580f264d72da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36e5feac5c2049b176ba77ec6369e3f510565b95b81ba8461921e2872367f132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2700cd9a526e4b4b7e2b0044a34147b9e1ee4bc645603acaec047ef059f7080d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.469883 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b03c82ad79aa19056242796eadb002aefdd1bfa6fdb597c0e70d76e164d282d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2
eee95f70ab89ed780903ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkx89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w9dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.476507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5kw62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12499e53-158e-42e5-ab05-3b37974a32e9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9acdc3b54c644380168864c9414051beca293efd1a132de8359843946f031f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkcwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5kw62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.483263 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41738147-157e-4427-87e1-8c96482b330c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f72c469e82b896507fa17efcce786a2a6851270684c72877a47d0179b5ac020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb10dfd08d28b6913bc55c2d4e3e8326911f199c062b6a5c9559ee1adaf00ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc 
kubenswrapper[4728]: I0125 05:40:09.496618 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc31034-a4bd-4b92-8511-6ac15a8a5952\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-25T05:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7d4b6945a40a17e29e7ace9e998cf6b924c60803161efb8537c01d92c5ecd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://67efd49e8bea5ea2da9f57da947e2ecd5bd3567bda52e32ef754ffaf3dcc58d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20cdb7eb6cd3d181d0f1a8444b1b106bfd4b219c6eb6e250f3b07440fbf3e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82716e3f8ea9497cec1b54e6c5878ca5d98fa56ed380fda5643b2fd54150c428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffb22efd53d8d4dd7cdd094ae021945a5fa7451784782400f5cdb6defd8925da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-25T05:38:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8beed368c36da8e9462cb5fb218f1e82cb416b3c6c761c2b807e2a2f6d5b51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5378c65e122fcd503427761ec5b760692197082b4b587648db1b6f9c2e743f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4adc8c980c4f1ef7f0523db1b2801694177413ef19a5d93fe94a14499099f06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-25T05:38:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-25T05:38:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-25T05:38:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-25T05:40:09Z is after 2025-08-24T17:21:41Z" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.523700 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.523735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.523746 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.523760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.523770 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.625542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.625571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.625582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.625592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.625599 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.727598 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.727626 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.727635 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.727646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.727653 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.829072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.829102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.829112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.829125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.829136 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.930918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.930945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.930953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.930964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:09 crc kubenswrapper[4728]: I0125 05:40:09.930973 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:09Z","lastTransitionTime":"2026-01-25T05:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.032900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.032923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.032932 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.032942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.032949 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.134832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.135045 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.135117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.135188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.135241 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.237580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.237611 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.237619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.237631 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.237639 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.328642 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.328659 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:10 crc kubenswrapper[4728]: E0125 05:40:10.328762 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:10 crc kubenswrapper[4728]: E0125 05:40:10.328844 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.341213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.341234 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.341242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.341252 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.341260 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.349797 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 17:17:52.214491987 +0000 UTC Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.443196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.443296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.443386 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.443451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.443514 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.545031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.545059 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.545067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.545077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.545107 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.646831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.646886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.646898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.646909 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.646917 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.748207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.748304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.748397 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.748480 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.748542 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.850156 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.850184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.850193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.850204 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.850212 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.952075 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.952194 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.952268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.952352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:10 crc kubenswrapper[4728]: I0125 05:40:10.952427 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:10Z","lastTransitionTime":"2026-01-25T05:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.054027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.054135 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.054205 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.054264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.054351 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.156121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.156154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.156163 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.156177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.156189 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.258192 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.258230 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.258238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.258251 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.258260 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.328299 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.328331 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:11 crc kubenswrapper[4728]: E0125 05:40:11.328485 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:11 crc kubenswrapper[4728]: E0125 05:40:11.328557 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.350740 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 18:15:00.357057635 +0000 UTC Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.360224 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.360255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.360265 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.360276 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.360285 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.462273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.462297 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.462305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.462316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.462340 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.564032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.564061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.564071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.564082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.564091 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.666184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.666219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.666229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.666242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.666253 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.767775 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.767829 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.767840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.767852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.767863 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.833540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.833608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.833618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.833638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.833651 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-25T05:40:11Z","lastTransitionTime":"2026-01-25T05:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.870261 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q"] Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.870741 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.872227 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.872766 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.873001 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.873971 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.952841 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m8nhm" podStartSLOduration=81.952824736 podStartE2EDuration="1m21.952824736s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:11.952718045 +0000 UTC m=+102.988596024" watchObservedRunningTime="2026-01-25 05:40:11.952824736 +0000 UTC m=+102.988702716" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.971730 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jwzv9" podStartSLOduration=80.971713331 podStartE2EDuration="1m20.971713331s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:11.961101567 +0000 UTC m=+102.996979547" watchObservedRunningTime="2026-01-25 
05:40:11.971713331 +0000 UTC m=+103.007591311" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.979470 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kdxw7" podStartSLOduration=81.979462427 podStartE2EDuration="1m21.979462427s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:11.972069433 +0000 UTC m=+103.007947413" watchObservedRunningTime="2026-01-25 05:40:11.979462427 +0000 UTC m=+103.015340408" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.986054 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fe70b350-b780-419c-bc57-6056daf1ff06-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.986204 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fe70b350-b780-419c-bc57-6056daf1ff06-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.986297 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe70b350-b780-419c-bc57-6056daf1ff06-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:11 crc 
kubenswrapper[4728]: I0125 05:40:11.986423 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe70b350-b780-419c-bc57-6056daf1ff06-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.986509 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe70b350-b780-419c-bc57-6056daf1ff06-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:11 crc kubenswrapper[4728]: I0125 05:40:11.991387 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=85.991375647 podStartE2EDuration="1m25.991375647s" podCreationTimestamp="2026-01-25 05:38:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:11.990844535 +0000 UTC m=+103.026722515" watchObservedRunningTime="2026-01-25 05:40:11.991375647 +0000 UTC m=+103.027253627" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.027049 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vdkq2" podStartSLOduration=82.027036307 podStartE2EDuration="1m22.027036307s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.02673106 +0000 UTC m=+103.062609041" watchObservedRunningTime="2026-01-25 05:40:12.027036307 +0000 UTC 
m=+103.062914287" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.036950 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.036938203 podStartE2EDuration="1m25.036938203s" podCreationTimestamp="2026-01-25 05:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.036758915 +0000 UTC m=+103.072636894" watchObservedRunningTime="2026-01-25 05:40:12.036938203 +0000 UTC m=+103.072816183" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.049038 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.04903043 podStartE2EDuration="55.04903043s" podCreationTimestamp="2026-01-25 05:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.048921955 +0000 UTC m=+103.084799935" watchObservedRunningTime="2026-01-25 05:40:12.04903043 +0000 UTC m=+103.084908410" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087388 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe70b350-b780-419c-bc57-6056daf1ff06-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087431 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe70b350-b780-419c-bc57-6056daf1ff06-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087481 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fe70b350-b780-419c-bc57-6056daf1ff06-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087507 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fe70b350-b780-419c-bc57-6056daf1ff06-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087525 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe70b350-b780-419c-bc57-6056daf1ff06-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087570 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fe70b350-b780-419c-bc57-6056daf1ff06-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.087682 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/fe70b350-b780-419c-bc57-6056daf1ff06-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.088527 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe70b350-b780-419c-bc57-6056daf1ff06-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.091685 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.091649403 podStartE2EDuration="29.091649403s" podCreationTimestamp="2026-01-25 05:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.073850574 +0000 UTC m=+103.109728554" watchObservedRunningTime="2026-01-25 05:40:12.091649403 +0000 UTC m=+103.127527373" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.094966 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe70b350-b780-419c-bc57-6056daf1ff06-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: \"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.100482 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe70b350-b780-419c-bc57-6056daf1ff06-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q778q\" (UID: 
\"fe70b350-b780-419c-bc57-6056daf1ff06\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.103141 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podStartSLOduration=82.103124617 podStartE2EDuration="1m22.103124617s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.102743859 +0000 UTC m=+103.138621839" watchObservedRunningTime="2026-01-25 05:40:12.103124617 +0000 UTC m=+103.139002597" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.103777 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=84.103772178 podStartE2EDuration="1m24.103772178s" podCreationTimestamp="2026-01-25 05:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.09233677 +0000 UTC m=+103.128214750" watchObservedRunningTime="2026-01-25 05:40:12.103772178 +0000 UTC m=+103.139650158" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.181284 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.328489 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.328487 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:12 crc kubenswrapper[4728]: E0125 05:40:12.328621 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:12 crc kubenswrapper[4728]: E0125 05:40:12.328733 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.351668 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 17:26:01.191920864 +0000 UTC Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.351722 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.357617 4728 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.705774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" event={"ID":"fe70b350-b780-419c-bc57-6056daf1ff06","Type":"ContainerStarted","Data":"92cb63859a2e6a3e53939a43022cb8b8b1d3f20a756ae2cf1f6cebe78a940973"} Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 
05:40:12.705847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" event={"ID":"fe70b350-b780-419c-bc57-6056daf1ff06","Type":"ContainerStarted","Data":"76163d6aa9a5bbc477d9e940866428e38b0dbee000f077333548f8449605ff8e"} Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.715664 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5kw62" podStartSLOduration=82.715651919 podStartE2EDuration="1m22.715651919s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.111282364 +0000 UTC m=+103.147160344" watchObservedRunningTime="2026-01-25 05:40:12.715651919 +0000 UTC m=+103.751529899" Jan 25 05:40:12 crc kubenswrapper[4728]: I0125 05:40:12.716355 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q778q" podStartSLOduration=82.716349884 podStartE2EDuration="1m22.716349884s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:12.715364787 +0000 UTC m=+103.751242767" watchObservedRunningTime="2026-01-25 05:40:12.716349884 +0000 UTC m=+103.752227865" Jan 25 05:40:13 crc kubenswrapper[4728]: I0125 05:40:13.328347 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:13 crc kubenswrapper[4728]: I0125 05:40:13.328378 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:13 crc kubenswrapper[4728]: E0125 05:40:13.328724 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:13 crc kubenswrapper[4728]: E0125 05:40:13.328853 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:14 crc kubenswrapper[4728]: I0125 05:40:14.328439 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:14 crc kubenswrapper[4728]: I0125 05:40:14.328439 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:14 crc kubenswrapper[4728]: E0125 05:40:14.328543 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:14 crc kubenswrapper[4728]: E0125 05:40:14.328596 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:15 crc kubenswrapper[4728]: I0125 05:40:15.328230 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:15 crc kubenswrapper[4728]: I0125 05:40:15.328374 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:15 crc kubenswrapper[4728]: E0125 05:40:15.328523 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:15 crc kubenswrapper[4728]: E0125 05:40:15.328636 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:16 crc kubenswrapper[4728]: I0125 05:40:16.328295 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:16 crc kubenswrapper[4728]: E0125 05:40:16.328431 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:16 crc kubenswrapper[4728]: I0125 05:40:16.328446 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:16 crc kubenswrapper[4728]: E0125 05:40:16.328517 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:17 crc kubenswrapper[4728]: I0125 05:40:17.329028 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:17 crc kubenswrapper[4728]: I0125 05:40:17.329135 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:17 crc kubenswrapper[4728]: E0125 05:40:17.329245 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:17 crc kubenswrapper[4728]: E0125 05:40:17.329724 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:17 crc kubenswrapper[4728]: I0125 05:40:17.330069 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:40:17 crc kubenswrapper[4728]: E0125 05:40:17.330229 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zmqrx_openshift-ovn-kubernetes(5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" Jan 25 05:40:18 crc kubenswrapper[4728]: I0125 05:40:18.328551 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:18 crc kubenswrapper[4728]: E0125 05:40:18.328657 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:18 crc kubenswrapper[4728]: I0125 05:40:18.328555 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:18 crc kubenswrapper[4728]: E0125 05:40:18.328720 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:19 crc kubenswrapper[4728]: I0125 05:40:19.327793 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:19 crc kubenswrapper[4728]: I0125 05:40:19.327794 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:19 crc kubenswrapper[4728]: E0125 05:40:19.328503 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:19 crc kubenswrapper[4728]: E0125 05:40:19.328658 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:20 crc kubenswrapper[4728]: I0125 05:40:20.327868 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:20 crc kubenswrapper[4728]: E0125 05:40:20.327990 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:20 crc kubenswrapper[4728]: I0125 05:40:20.327868 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:20 crc kubenswrapper[4728]: E0125 05:40:20.328059 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:21 crc kubenswrapper[4728]: I0125 05:40:21.328517 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:21 crc kubenswrapper[4728]: I0125 05:40:21.328575 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:21 crc kubenswrapper[4728]: E0125 05:40:21.328646 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:21 crc kubenswrapper[4728]: E0125 05:40:21.328754 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:22 crc kubenswrapper[4728]: I0125 05:40:22.328082 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:22 crc kubenswrapper[4728]: I0125 05:40:22.328203 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:22 crc kubenswrapper[4728]: E0125 05:40:22.328311 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:22 crc kubenswrapper[4728]: E0125 05:40:22.328628 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.327997 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:23 crc kubenswrapper[4728]: E0125 05:40:23.328090 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.328220 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:23 crc kubenswrapper[4728]: E0125 05:40:23.328300 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.731718 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/1.log" Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.732225 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/0.log" Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.732268 4728 generic.go:334] "Generic (PLEG): container finished" podID="c2ffc038-3d70-4d2c-b150-e8529f622238" containerID="18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991" exitCode=1 Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.732293 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerDied","Data":"18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991"} Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.732341 4728 scope.go:117] "RemoveContainer" containerID="9ae73080be47485504ed9d4362f39cfc08740ca6bbf4cc126c5fc67d95226cd5" Jan 25 05:40:23 crc kubenswrapper[4728]: I0125 05:40:23.732595 4728 scope.go:117] "RemoveContainer" containerID="18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991" Jan 25 05:40:23 crc kubenswrapper[4728]: E0125 05:40:23.732733 4728 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kdxw7_openshift-multus(c2ffc038-3d70-4d2c-b150-e8529f622238)\"" pod="openshift-multus/multus-kdxw7" podUID="c2ffc038-3d70-4d2c-b150-e8529f622238" Jan 25 05:40:24 crc kubenswrapper[4728]: I0125 05:40:24.328372 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:24 crc kubenswrapper[4728]: I0125 05:40:24.328405 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:24 crc kubenswrapper[4728]: E0125 05:40:24.328470 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:24 crc kubenswrapper[4728]: E0125 05:40:24.328530 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:24 crc kubenswrapper[4728]: I0125 05:40:24.735473 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/1.log" Jan 25 05:40:25 crc kubenswrapper[4728]: I0125 05:40:25.328802 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:25 crc kubenswrapper[4728]: I0125 05:40:25.328902 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:25 crc kubenswrapper[4728]: E0125 05:40:25.328987 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:25 crc kubenswrapper[4728]: E0125 05:40:25.329154 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:26 crc kubenswrapper[4728]: I0125 05:40:26.328223 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:26 crc kubenswrapper[4728]: I0125 05:40:26.328411 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:26 crc kubenswrapper[4728]: E0125 05:40:26.328468 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:26 crc kubenswrapper[4728]: E0125 05:40:26.328620 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:27 crc kubenswrapper[4728]: I0125 05:40:27.328675 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:27 crc kubenswrapper[4728]: I0125 05:40:27.328711 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:27 crc kubenswrapper[4728]: E0125 05:40:27.328944 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:27 crc kubenswrapper[4728]: E0125 05:40:27.329055 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.328716 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.328728 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:28 crc kubenswrapper[4728]: E0125 05:40:28.329046 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:28 crc kubenswrapper[4728]: E0125 05:40:28.329167 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.329183 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.744928 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/3.log" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.747242 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerStarted","Data":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.747642 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.779842 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podStartSLOduration=98.779816783 podStartE2EDuration="1m38.779816783s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:28.778543442 +0000 UTC m=+119.814421422" watchObservedRunningTime="2026-01-25 05:40:28.779816783 +0000 UTC m=+119.815694764" Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.919561 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k5pj4"] Jan 25 05:40:28 crc kubenswrapper[4728]: I0125 05:40:28.919661 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:28 crc kubenswrapper[4728]: E0125 05:40:28.919746 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:29 crc kubenswrapper[4728]: I0125 05:40:29.328193 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:29 crc kubenswrapper[4728]: I0125 05:40:29.328227 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:29 crc kubenswrapper[4728]: E0125 05:40:29.329636 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:29 crc kubenswrapper[4728]: E0125 05:40:29.329750 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:29 crc kubenswrapper[4728]: E0125 05:40:29.393204 4728 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 25 05:40:29 crc kubenswrapper[4728]: E0125 05:40:29.397712 4728 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 25 05:40:30 crc kubenswrapper[4728]: I0125 05:40:30.328454 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:30 crc kubenswrapper[4728]: E0125 05:40:30.328560 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:30 crc kubenswrapper[4728]: I0125 05:40:30.328468 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:30 crc kubenswrapper[4728]: E0125 05:40:30.328761 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:31 crc kubenswrapper[4728]: I0125 05:40:31.328472 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:31 crc kubenswrapper[4728]: E0125 05:40:31.328575 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:31 crc kubenswrapper[4728]: I0125 05:40:31.328623 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:31 crc kubenswrapper[4728]: E0125 05:40:31.328737 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:32 crc kubenswrapper[4728]: I0125 05:40:32.328630 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:32 crc kubenswrapper[4728]: E0125 05:40:32.328724 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:32 crc kubenswrapper[4728]: I0125 05:40:32.328640 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:32 crc kubenswrapper[4728]: E0125 05:40:32.328886 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:33 crc kubenswrapper[4728]: I0125 05:40:33.328563 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:33 crc kubenswrapper[4728]: E0125 05:40:33.328722 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:33 crc kubenswrapper[4728]: I0125 05:40:33.328738 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:33 crc kubenswrapper[4728]: E0125 05:40:33.328840 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:34 crc kubenswrapper[4728]: I0125 05:40:34.328139 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:34 crc kubenswrapper[4728]: E0125 05:40:34.328457 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:34 crc kubenswrapper[4728]: I0125 05:40:34.328160 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:34 crc kubenswrapper[4728]: E0125 05:40:34.328697 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:34 crc kubenswrapper[4728]: E0125 05:40:34.399520 4728 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 25 05:40:35 crc kubenswrapper[4728]: I0125 05:40:35.328081 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:35 crc kubenswrapper[4728]: I0125 05:40:35.328122 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:35 crc kubenswrapper[4728]: E0125 05:40:35.328912 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:35 crc kubenswrapper[4728]: E0125 05:40:35.329043 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:35 crc kubenswrapper[4728]: I0125 05:40:35.329243 4728 scope.go:117] "RemoveContainer" containerID="18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991" Jan 25 05:40:35 crc kubenswrapper[4728]: I0125 05:40:35.763898 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/1.log" Jan 25 05:40:35 crc kubenswrapper[4728]: I0125 05:40:35.763944 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerStarted","Data":"02594a5778eadcc12813b7374fa9212bd49759f86eda609cec8b87645a3c371e"} Jan 25 05:40:35 crc kubenswrapper[4728]: I0125 05:40:35.837155 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:40:36 crc kubenswrapper[4728]: I0125 05:40:36.328839 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:36 crc kubenswrapper[4728]: I0125 05:40:36.328857 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:36 crc kubenswrapper[4728]: E0125 05:40:36.329247 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:36 crc kubenswrapper[4728]: E0125 05:40:36.329593 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:37 crc kubenswrapper[4728]: I0125 05:40:37.329444 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:37 crc kubenswrapper[4728]: I0125 05:40:37.329539 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:37 crc kubenswrapper[4728]: E0125 05:40:37.329652 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:37 crc kubenswrapper[4728]: E0125 05:40:37.329714 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:38 crc kubenswrapper[4728]: I0125 05:40:38.328296 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:38 crc kubenswrapper[4728]: I0125 05:40:38.328309 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:38 crc kubenswrapper[4728]: E0125 05:40:38.328549 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 25 05:40:38 crc kubenswrapper[4728]: E0125 05:40:38.328439 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k5pj4" podUID="accc0eb5-6067-4ab9-bbab-6d2ae898942f" Jan 25 05:40:39 crc kubenswrapper[4728]: I0125 05:40:39.328556 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:39 crc kubenswrapper[4728]: I0125 05:40:39.328556 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:39 crc kubenswrapper[4728]: E0125 05:40:39.329843 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 25 05:40:39 crc kubenswrapper[4728]: E0125 05:40:39.329918 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 25 05:40:40 crc kubenswrapper[4728]: I0125 05:40:40.327951 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:40 crc kubenswrapper[4728]: I0125 05:40:40.328113 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:40:40 crc kubenswrapper[4728]: I0125 05:40:40.329745 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 25 05:40:40 crc kubenswrapper[4728]: I0125 05:40:40.329984 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 25 05:40:40 crc kubenswrapper[4728]: I0125 05:40:40.330041 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 25 05:40:40 crc kubenswrapper[4728]: I0125 05:40:40.330056 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 25 05:40:41 crc kubenswrapper[4728]: I0125 05:40:41.327894 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:41 crc kubenswrapper[4728]: I0125 05:40:41.327926 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:41 crc kubenswrapper[4728]: I0125 05:40:41.329548 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 25 05:40:41 crc kubenswrapper[4728]: I0125 05:40:41.330518 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.581256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.606808 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z8p7j"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.607168 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.607559 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cz94k"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.608043 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.608526 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.608717 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.609205 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sw8t2"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.609506 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.609742 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.610040 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.613358 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-24kpz"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.613591 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.614110 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.617508 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.617805 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.624943 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.624957 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625041 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625261 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625307 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625417 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625901 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625918 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.625946 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.626121 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.626239 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.626295 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629390 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-audit-policies\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629429 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: 
\"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629454 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln82q\" (UniqueName: \"kubernetes.io/projected/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-kube-api-access-ln82q\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629489 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ad7d4-fb59-48ec-936b-305fa0b0966e-config\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629520 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-serving-cert\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629537 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qk2\" (UniqueName: \"kubernetes.io/projected/41729c13-60fa-4005-9d4b-739071383860-kube-api-access-m7qk2\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629558 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6n4\" (UniqueName: \"kubernetes.io/projected/652ad7d4-fb59-48ec-936b-305fa0b0966e-kube-api-access-np6n4\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629577 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc59d\" (UniqueName: \"kubernetes.io/projected/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-kube-api-access-mc59d\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629595 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46p9m\" (UniqueName: \"kubernetes.io/projected/7523755e-9f8c-4740-93d2-e35cc4f9757d-kube-api-access-46p9m\") pod \"cluster-samples-operator-665b6dd947-hl5k8\" (UID: \"7523755e-9f8c-4740-93d2-e35cc4f9757d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629612 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-image-import-ca\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629631 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-serving-cert\") pod 
\"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629659 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9488cfe-a86c-45af-9f75-5515ba6060ed-audit-dir\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629676 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f291fdb-786d-4cd1-b61f-f54de47908ff-node-pullsecrets\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629694 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629714 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-serving-cert\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629733 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-config\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629760 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629778 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-encryption-config\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629798 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-encryption-config\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629813 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcs6f\" (UniqueName: \"kubernetes.io/projected/1f291fdb-786d-4cd1-b61f-f54de47908ff-kube-api-access-tcs6f\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: 
I0125 05:40:42.629839 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-client-ca\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629858 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41729c13-60fa-4005-9d4b-739071383860-serving-cert\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629878 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-audit\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629894 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-etcd-serving-ca\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629912 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-config\") pod \"authentication-operator-69f744f599-24kpz\" (UID: 
\"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629950 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7523755e-9f8c-4740-93d2-e35cc4f9757d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hl5k8\" (UID: \"7523755e-9f8c-4740-93d2-e35cc4f9757d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629969 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-serving-cert\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.629985 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/652ad7d4-fb59-48ec-936b-305fa0b0966e-images\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630004 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-service-ca-bundle\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630041 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-etcd-client\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630056 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-458d9\" (UniqueName: \"kubernetes.io/projected/b9488cfe-a86c-45af-9f75-5515ba6060ed-kube-api-access-458d9\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630075 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/652ad7d4-fb59-48ec-936b-305fa0b0966e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cz94k\" 
(UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630092 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-config\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630117 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-etcd-client\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630137 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f291fdb-786d-4cd1-b61f-f54de47908ff-audit-dir\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630153 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-config\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.630172 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-client-ca\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.635692 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k2b65"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636001 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636216 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636268 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636495 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636706 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636856 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.636881 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637111 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637262 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637284 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637428 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637526 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637572 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637674 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637699 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.637847 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.638004 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.638158 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.638284 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.638678 4728 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.638842 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.639507 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.639707 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.643366 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.643661 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.650271 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.650401 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.650557 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.650686 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.650769 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: 
I0125 05:40:42.650872 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.650929 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.651206 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.652866 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.652895 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ntzq4"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653110 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653225 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653278 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653373 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653480 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653571 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.653784 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.654248 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.654385 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.654488 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.654656 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.654734 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.654932 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nlfgj"] Jan 25 05:40:42 crc 
kubenswrapper[4728]: I0125 05:40:42.655122 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.655144 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.655339 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.655522 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.655561 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.655727 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.656933 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.657119 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.657610 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.657752 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.657962 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.657992 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.658151 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2jdpb"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.658476 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.658841 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.659128 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.659452 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zd2wv"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.659905 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.660459 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.660710 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.660877 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.661077 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.661169 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.661185 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.662037 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.662371 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.663862 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.665037 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671165 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671421 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671635 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r62pr"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671668 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671840 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671902 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.671975 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.672103 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.672213 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.672854 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.673087 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.674309 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.675377 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-z6t4b"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.675947 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.676687 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.677084 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.688350 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22q6g"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.689248 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.689546 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.690158 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.691232 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.691310 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.691539 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.691585 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.691638 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.691755 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.697789 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.697998 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.698190 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.698538 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.698727 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.698738 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.698856 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.699028 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.699109 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.699037 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.699637 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.699906 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.699955 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.700173 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.700283 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.701439 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.702063 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.702750 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.706879 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.706917 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.707856 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-75jc5"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.708297 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.708339 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.708925 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.709573 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.710428 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.710639 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.710773 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.711030 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5l6wr"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.711106 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.711447 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.711586 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.711856 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z8p7j"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.711958 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.712250 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.714638 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.714761 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.714955 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.715900 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.716676 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.717099 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.717492 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb65z"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.717865 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.718089 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.718589 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.719205 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5l465"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.719601 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.720436 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.720857 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.721117 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9bds2"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.721497 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.721887 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.723399 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.723422 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-24kpz"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.724190 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cz94k"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.724963 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r62pr"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.725711 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.727279 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.728521 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k2b65"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.728607 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.728964 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.730001 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.730751 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-serving-cert\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.730802 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qk2\" (UniqueName: \"kubernetes.io/projected/41729c13-60fa-4005-9d4b-739071383860-kube-api-access-m7qk2\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.730969 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zd2wv"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6n4\" (UniqueName: \"kubernetes.io/projected/652ad7d4-fb59-48ec-936b-305fa0b0966e-kube-api-access-np6n4\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731585 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vd8j\" (UniqueName: 
\"kubernetes.io/projected/a529b2fb-a2b1-4320-9d37-31df46bf3246-kube-api-access-6vd8j\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731629 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc59d\" (UniqueName: \"kubernetes.io/projected/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-kube-api-access-mc59d\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731652 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46p9m\" (UniqueName: \"kubernetes.io/projected/7523755e-9f8c-4740-93d2-e35cc4f9757d-kube-api-access-46p9m\") pod \"cluster-samples-operator-665b6dd947-hl5k8\" (UID: \"7523755e-9f8c-4740-93d2-e35cc4f9757d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731692 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-image-import-ca\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731709 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-serving-cert\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 
05:40:42.731735 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768b0ebd-b1de-4848-900d-96d7ef81e650-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731775 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9488cfe-a86c-45af-9f75-5515ba6060ed-audit-dir\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731795 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9z8\" (UniqueName: \"kubernetes.io/projected/d2293758-c295-487f-a399-678cc08cf5dc-kube-api-access-cr9z8\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731854 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f291fdb-786d-4cd1-b61f-f54de47908ff-node-pullsecrets\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731873 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " 
pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731924 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9488cfe-a86c-45af-9f75-5515ba6060ed-audit-dir\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731898 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-serving-cert\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.731985 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-config\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732017 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f291fdb-786d-4cd1-b61f-f54de47908ff-node-pullsecrets\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: 
\"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732073 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-encryption-config\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732106 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/096f04e7-5491-45f6-9290-0a5bd7b7df49-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732134 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/768b0ebd-b1de-4848-900d-96d7ef81e650-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732160 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-encryption-config\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732188 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tcs6f\" (UniqueName: \"kubernetes.io/projected/1f291fdb-786d-4cd1-b61f-f54de47908ff-kube-api-access-tcs6f\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732212 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-client-ca\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732239 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dk6s\" (UniqueName: \"kubernetes.io/projected/cdc70bbe-b368-404a-8a3e-fafa31a91c0a-kube-api-access-2dk6s\") pod \"migrator-59844c95c7-mmk2s\" (UID: \"cdc70bbe-b368-404a-8a3e-fafa31a91c0a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732268 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5l6wr\" (UID: \"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732284 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-etcd-service-ca\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732302 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41729c13-60fa-4005-9d4b-739071383860-serving-cert\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732331 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzf6p\" (UniqueName: \"kubernetes.io/projected/bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1-kube-api-access-zzf6p\") pod \"multus-admission-controller-857f4d67dd-5l6wr\" (UID: \"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732355 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-audit\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732376 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-etcd-serving-ca\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732405 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b22b581-8f04-46de-9d8c-7661eae48179-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732420 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2293758-c295-487f-a399-678cc08cf5dc-etcd-client\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732434 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7523755e-9f8c-4740-93d2-e35cc4f9757d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hl5k8\" (UID: \"7523755e-9f8c-4740-93d2-e35cc4f9757d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732466 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-serving-cert\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: 
I0125 05:40:42.732479 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-config\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732493 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/652ad7d4-fb59-48ec-936b-305fa0b0966e-images\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732508 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732525 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732541 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-service-ca-bundle\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") 
" pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732560 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-etcd-client\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732579 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-458d9\" (UniqueName: \"kubernetes.io/projected/b9488cfe-a86c-45af-9f75-5515ba6060ed-kube-api-access-458d9\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732598 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69pbs\" (UniqueName: \"kubernetes.io/projected/2b22b581-8f04-46de-9d8c-7661eae48179-kube-api-access-69pbs\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732613 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/652ad7d4-fb59-48ec-936b-305fa0b0966e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732628 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-config\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732634 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732654 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-etcd-client\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732694 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f291fdb-786d-4cd1-b61f-f54de47908ff-audit-dir\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732714 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-config\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732736 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-client-ca\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732753 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-srv-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732797 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-audit-policies\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732814 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732842 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcbj\" (UniqueName: \"kubernetes.io/projected/096f04e7-5491-45f6-9290-0a5bd7b7df49-kube-api-access-4mcbj\") pod \"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732862 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln82q\" (UniqueName: 
\"kubernetes.io/projected/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-kube-api-access-ln82q\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732901 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/768b0ebd-b1de-4848-900d-96d7ef81e650-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732916 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2293758-c295-487f-a399-678cc08cf5dc-serving-cert\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732949 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-config\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.732964 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-etcd-ca\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: 
I0125 05:40:42.732992 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ad7d4-fb59-48ec-936b-305fa0b0966e-config\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.733008 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b22b581-8f04-46de-9d8c-7661eae48179-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.733079 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-image-import-ca\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.733257 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-config\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.733377 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f291fdb-786d-4cd1-b61f-f54de47908ff-audit-dir\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " 
pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.733431 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.733785 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-config\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.734201 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-audit-policies\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.734554 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/652ad7d4-fb59-48ec-936b-305fa0b0966e-images\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.734647 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-config\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.734943 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-client-ca\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.734960 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5l6wr"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.735207 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652ad7d4-fb59-48ec-936b-305fa0b0966e-config\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.735541 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-serving-cert\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.735660 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.735685 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-encryption-config\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736010 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-etcd-client\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736024 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41729c13-60fa-4005-9d4b-739071383860-service-ca-bundle\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736291 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-serving-cert\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736542 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-encryption-config\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736645 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9488cfe-a86c-45af-9f75-5515ba6060ed-serving-cert\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.736805 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sw8t2"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737435 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-config\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737528 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/652ad7d4-fb59-48ec-936b-305fa0b0966e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737542 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-etcd-serving-ca\") pod \"apiserver-76f77b778f-sw8t2\" (UID: 
\"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737554 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-serving-cert\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737466 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f291fdb-786d-4cd1-b61f-f54de47908ff-audit\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737626 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-client-ca\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737837 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41729c13-60fa-4005-9d4b-739071383860-serving-cert\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.737990 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pbp7k"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.738402 4728 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.738626 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9488cfe-a86c-45af-9f75-5515ba6060ed-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.738719 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.738871 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f291fdb-786d-4cd1-b61f-f54de47908ff-etcd-client\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.739073 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7523755e-9f8c-4740-93d2-e35cc4f9757d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hl5k8\" (UID: \"7523755e-9f8c-4740-93d2-e35cc4f9757d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.739626 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.740409 
4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.741133 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.745231 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nlfgj"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.745260 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.745294 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.747093 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ntzq4"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.748696 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.748885 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.750300 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-z6t4b"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.752017 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2jdpb"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.752831 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.753903 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22q6g"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.754764 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb65z"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.755588 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.756449 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9bds2"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.757223 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.758028 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.758820 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.759640 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.760400 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zbkbh"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.760903 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.761244 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5l465"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.762051 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbkbh"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.768555 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.796447 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.808287 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.817047 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hh2p9"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.819896 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pvqpc"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.820030 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.820691 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pvqpc"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.820778 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.820781 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hh2p9"] Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.831715 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.833821 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcbj\" (UniqueName: \"kubernetes.io/projected/096f04e7-5491-45f6-9290-0a5bd7b7df49-kube-api-access-4mcbj\") pod \"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.833943 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-etcd-ca\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834029 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/768b0ebd-b1de-4848-900d-96d7ef81e650-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834102 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2293758-c295-487f-a399-678cc08cf5dc-serving-cert\") pod 
\"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834229 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-config\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834302 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b22b581-8f04-46de-9d8c-7661eae48179-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834475 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vd8j\" (UniqueName: \"kubernetes.io/projected/a529b2fb-a2b1-4320-9d37-31df46bf3246-kube-api-access-6vd8j\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834580 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768b0ebd-b1de-4848-900d-96d7ef81e650-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834650 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cr9z8\" (UniqueName: \"kubernetes.io/projected/d2293758-c295-487f-a399-678cc08cf5dc-kube-api-access-cr9z8\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834731 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/096f04e7-5491-45f6-9290-0a5bd7b7df49-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834861 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/768b0ebd-b1de-4848-900d-96d7ef81e650-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.834944 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dk6s\" (UniqueName: \"kubernetes.io/projected/cdc70bbe-b368-404a-8a3e-fafa31a91c0a-kube-api-access-2dk6s\") pod \"migrator-59844c95c7-mmk2s\" (UID: \"cdc70bbe-b368-404a-8a3e-fafa31a91c0a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835013 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5l6wr\" (UID: 
\"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835081 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-etcd-service-ca\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835151 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzf6p\" (UniqueName: \"kubernetes.io/projected/bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1-kube-api-access-zzf6p\") pod \"multus-admission-controller-857f4d67dd-5l6wr\" (UID: \"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835121 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768b0ebd-b1de-4848-900d-96d7ef81e650-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835220 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b22b581-8f04-46de-9d8c-7661eae48179-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835277 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2293758-c295-487f-a399-678cc08cf5dc-etcd-client\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835302 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835370 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69pbs\" (UniqueName: \"kubernetes.io/projected/2b22b581-8f04-46de-9d8c-7661eae48179-kube-api-access-69pbs\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.835416 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-srv-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.849356 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.868396 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 25 
05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.874956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b22b581-8f04-46de-9d8c-7661eae48179-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.888370 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.908200 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.928610 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.948800 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.968559 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.978131 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b22b581-8f04-46de-9d8c-7661eae48179-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:42 crc kubenswrapper[4728]: I0125 05:40:42.989144 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.008651 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.029391 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.048391 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.069515 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.089579 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.096755 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/768b0ebd-b1de-4848-900d-96d7ef81e650-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.108938 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.133441 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.149524 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.154964 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-etcd-ca\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.168991 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.188679 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.196729 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2293758-c295-487f-a399-678cc08cf5dc-serving-cert\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.209027 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.218316 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2293758-c295-487f-a399-678cc08cf5dc-etcd-client\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.228390 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 
25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.234843 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-config\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.248280 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.255768 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2293758-c295-487f-a399-678cc08cf5dc-etcd-service-ca\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.268730 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.289201 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.308522 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.329096 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.349114 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.368383 4728 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-image-registry"/"installation-pull-secrets" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.389074 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.408675 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.428801 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.453220 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.468598 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.508857 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.528641 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.548825 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.569141 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.588904 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 
05:40:43.608441 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.629258 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.648667 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.669087 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.688853 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.708355 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.727977 4728 request.go:700] Waited for 1.016830718s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.728885 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.749104 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.768992 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 25 05:40:43 crc kubenswrapper[4728]: 
I0125 05:40:43.788229 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.808603 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.818209 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5l6wr\" (UID: \"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.829382 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 25 05:40:43 crc kubenswrapper[4728]: E0125 05:40:43.835360 4728 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Jan 25 05:40:43 crc kubenswrapper[4728]: E0125 05:40:43.835409 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096f04e7-5491-45f6-9290-0a5bd7b7df49-control-plane-machine-set-operator-tls podName:096f04e7-5491-45f6-9290-0a5bd7b7df49 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:44.335395322 +0000 UTC m=+135.371273303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/096f04e7-5491-45f6-9290-0a5bd7b7df49-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-pb5ln" (UID: "096f04e7-5491-45f6-9290-0a5bd7b7df49") : failed to sync secret cache: timed out waiting for the condition Jan 25 05:40:43 crc kubenswrapper[4728]: E0125 05:40:43.835416 4728 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Jan 25 05:40:43 crc kubenswrapper[4728]: E0125 05:40:43.835456 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-profile-collector-cert podName:a529b2fb-a2b1-4320-9d37-31df46bf3246 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:44.335443854 +0000 UTC m=+135.371321845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-profile-collector-cert") pod "olm-operator-6b444d44fb-hplc9" (UID: "a529b2fb-a2b1-4320-9d37-31df46bf3246") : failed to sync secret cache: timed out waiting for the condition Jan 25 05:40:43 crc kubenswrapper[4728]: E0125 05:40:43.835508 4728 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 25 05:40:43 crc kubenswrapper[4728]: E0125 05:40:43.835543 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-srv-cert podName:a529b2fb-a2b1-4320-9d37-31df46bf3246 nodeName:}" failed. No retries permitted until 2026-01-25 05:40:44.335534597 +0000 UTC m=+135.371412587 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-srv-cert") pod "olm-operator-6b444d44fb-hplc9" (UID: "a529b2fb-a2b1-4320-9d37-31df46bf3246") : failed to sync secret cache: timed out waiting for the condition Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.849514 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.868633 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.888438 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.909150 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.929364 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.948555 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.968697 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 25 05:40:43 crc kubenswrapper[4728]: I0125 05:40:43.989109 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.008345 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.033499 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.048893 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.069230 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.088399 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.108643 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.128990 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.149613 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.169076 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.188827 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.209270 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.228468 4728 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.248776 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.269101 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.288881 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.309342 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.328470 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.348886 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.349811 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-srv-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.349928 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/096f04e7-5491-45f6-9290-0a5bd7b7df49-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.349975 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.352595 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-srv-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.352718 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/096f04e7-5491-45f6-9290-0a5bd7b7df49-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.352908 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a529b2fb-a2b1-4320-9d37-31df46bf3246-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.369176 4728 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.420271 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qk2\" (UniqueName: \"kubernetes.io/projected/41729c13-60fa-4005-9d4b-739071383860-kube-api-access-m7qk2\") pod \"authentication-operator-69f744f599-24kpz\" (UID: \"41729c13-60fa-4005-9d4b-739071383860\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.440142 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6n4\" (UniqueName: \"kubernetes.io/projected/652ad7d4-fb59-48ec-936b-305fa0b0966e-kube-api-access-np6n4\") pod \"machine-api-operator-5694c8668f-cz94k\" (UID: \"652ad7d4-fb59-48ec-936b-305fa0b0966e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.456600 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.459564 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc59d\" (UniqueName: \"kubernetes.io/projected/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-kube-api-access-mc59d\") pod \"route-controller-manager-6576b87f9c-l8872\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.480789 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46p9m\" (UniqueName: \"kubernetes.io/projected/7523755e-9f8c-4740-93d2-e35cc4f9757d-kube-api-access-46p9m\") pod \"cluster-samples-operator-665b6dd947-hl5k8\" (UID: \"7523755e-9f8c-4740-93d2-e35cc4f9757d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.500570 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln82q\" (UniqueName: \"kubernetes.io/projected/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-kube-api-access-ln82q\") pod \"controller-manager-879f6c89f-z8p7j\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.521240 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcs6f\" (UniqueName: \"kubernetes.io/projected/1f291fdb-786d-4cd1-b61f-f54de47908ff-kube-api-access-tcs6f\") pod \"apiserver-76f77b778f-sw8t2\" (UID: \"1f291fdb-786d-4cd1-b61f-f54de47908ff\") " pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.540462 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-458d9\" (UniqueName: 
\"kubernetes.io/projected/b9488cfe-a86c-45af-9f75-5515ba6060ed-kube-api-access-458d9\") pod \"apiserver-7bbb656c7d-8blvp\" (UID: \"b9488cfe-a86c-45af-9f75-5515ba6060ed\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.548990 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.566060 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-24kpz"] Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.568771 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 25 05:40:44 crc kubenswrapper[4728]: W0125 05:40:44.570396 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41729c13_60fa_4005_9d4b_739071383860.slice/crio-6e4cd4a085f14c7cc607e918871106e93f1c5471e945b336bb12e83ed06be0f1 WatchSource:0}: Error finding container 6e4cd4a085f14c7cc607e918871106e93f1c5471e945b336bb12e83ed06be0f1: Status 404 returned error can't find the container with id 6e4cd4a085f14c7cc607e918871106e93f1c5471e945b336bb12e83ed06be0f1 Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.588879 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.608908 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.629172 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.649299 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.668771 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.689608 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.708886 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.718021 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.723805 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.728448 4728 request.go:700] Waited for 1.90807204s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.729485 4728 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.731408 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.738207 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.743230 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.748765 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.766613 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.772335 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.786127 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" event={"ID":"41729c13-60fa-4005-9d4b-739071383860","Type":"ContainerStarted","Data":"ad30ff8fbdfd71c1a53516978e1bb0769dde86ca940cc3121fd87e567327146b"} Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.786172 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" event={"ID":"41729c13-60fa-4005-9d4b-739071383860","Type":"ContainerStarted","Data":"6e4cd4a085f14c7cc607e918871106e93f1c5471e945b336bb12e83ed06be0f1"} Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.789594 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.821367 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcbj\" (UniqueName: \"kubernetes.io/projected/096f04e7-5491-45f6-9290-0a5bd7b7df49-kube-api-access-4mcbj\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-pb5ln\" (UID: \"096f04e7-5491-45f6-9290-0a5bd7b7df49\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.842628 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vd8j\" (UniqueName: \"kubernetes.io/projected/a529b2fb-a2b1-4320-9d37-31df46bf3246-kube-api-access-6vd8j\") pod \"olm-operator-6b444d44fb-hplc9\" (UID: \"a529b2fb-a2b1-4320-9d37-31df46bf3246\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.848658 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z8p7j"] Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.862428 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9z8\" (UniqueName: \"kubernetes.io/projected/d2293758-c295-487f-a399-678cc08cf5dc-kube-api-access-cr9z8\") pod \"etcd-operator-b45778765-z6t4b\" (UID: \"d2293758-c295-487f-a399-678cc08cf5dc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.889690 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/768b0ebd-b1de-4848-900d-96d7ef81e650-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sbs8z\" (UID: \"768b0ebd-b1de-4848-900d-96d7ef81e650\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.900768 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.902070 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dk6s\" (UniqueName: \"kubernetes.io/projected/cdc70bbe-b368-404a-8a3e-fafa31a91c0a-kube-api-access-2dk6s\") pod \"migrator-59844c95c7-mmk2s\" (UID: \"cdc70bbe-b368-404a-8a3e-fafa31a91c0a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.907933 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.913915 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cz94k"] Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.920805 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzf6p\" (UniqueName: \"kubernetes.io/projected/bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1-kube-api-access-zzf6p\") pod \"multus-admission-controller-857f4d67dd-5l6wr\" (UID: \"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.930762 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.941391 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69pbs\" (UniqueName: \"kubernetes.io/projected/2b22b581-8f04-46de-9d8c-7661eae48179-kube-api-access-69pbs\") pod \"kube-storage-version-migrator-operator-b67b599dd-7v4ng\" (UID: \"2b22b581-8f04-46de-9d8c-7661eae48179\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.952353 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.970793 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.974333 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8"] Jan 25 05:40:44 crc kubenswrapper[4728]: I0125 05:40:44.976123 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.041021 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.057963 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvdq\" (UniqueName: \"kubernetes.io/projected/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-kube-api-access-8jvdq\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.057992 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8402995d-516d-45ba-8f82-d99545fa7334-metrics-tls\") pod \"dns-operator-744455d44c-zd2wv\" (UID: \"8402995d-516d-45ba-8f82-d99545fa7334\") " pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kfrg\" (UniqueName: \"kubernetes.io/projected/b153c459-f101-48fe-9dd4-488371396842-kube-api-access-2kfrg\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058050 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gtr\" (UniqueName: \"kubernetes.io/projected/8e426667-1f73-4e01-834a-7876d9495732-kube-api-access-x2gtr\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058066 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058081 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-trusted-ca-bundle\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058093 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vs94\" (UniqueName: \"kubernetes.io/projected/607307b0-b3c9-4a00-9347-a299a689c1c8-kube-api-access-8vs94\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058131 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 
05:40:45.058145 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-proxy-tls\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058159 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e620419c-1014-456e-9c25-98309e3dddb4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058195 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlzn9\" (UniqueName: \"kubernetes.io/projected/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-kube-api-access-nlzn9\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058209 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-tls\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058222 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/20bbec16-8855-4107-864f-386e34654e2d-signing-key\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058234 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a01fe6-fee0-4000-b260-4092c368fbc1-metrics-tls\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058278 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.058308 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7b9\" (UniqueName: \"kubernetes.io/projected/755592b0-bef9-4949-8614-144278dea776-kube-api-access-cb7b9\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060553 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/20bbec16-8855-4107-864f-386e34654e2d-signing-cabundle\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 
05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060587 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060640 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-secret-volume\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060662 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5q75\" (UniqueName: \"kubernetes.io/projected/b304a331-a7c7-43fc-8bc7-2b62330056f5-kube-api-access-h5q75\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060675 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060702 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060715 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060736 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-config\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060750 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060762 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1768785d-7da2-4694-a0ff-d010df0868f8-serving-cert\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " 
pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060777 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1151360-e93c-468c-99a7-0343d2417cb0-apiservice-cert\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060796 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-config-volume\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060809 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee28691-2161-4afa-becc-1baaab86202d-config\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060822 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-dir\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060835 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060867 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8gsf\" (UniqueName: \"kubernetes.io/projected/95303aa9-3fb0-48a8-8df8-0f601653ac48-kube-api-access-s8gsf\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060882 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b153c459-f101-48fe-9dd4-488371396842-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060895 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhq8m\" (UniqueName: \"kubernetes.io/projected/20bbec16-8855-4107-864f-386e34654e2d-kube-api-access-lhq8m\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060907 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b153c459-f101-48fe-9dd4-488371396842-serving-cert\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: 
\"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060921 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-certificates\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060934 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-serving-cert\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060948 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1768785d-7da2-4694-a0ff-d010df0868f8-config\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060960 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/755592b0-bef9-4949-8614-144278dea776-proxy-tls\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060983 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e620419c-1014-456e-9c25-98309e3dddb4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.060996 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmpjl\" (UniqueName: \"kubernetes.io/projected/be8794a5-e8f0-4216-ade3-6def48bd8859-kube-api-access-zmpjl\") pod \"downloads-7954f5f757-nlfgj\" (UID: \"be8794a5-e8f0-4216-ade3-6def48bd8859\") " pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061008 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1768785d-7da2-4694-a0ff-d010df0868f8-trusted-ca\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-stats-auth\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061037 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b304a331-a7c7-43fc-8bc7-2b62330056f5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061050 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e426667-1f73-4e01-834a-7876d9495732-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061064 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83a01fe6-fee0-4000-b260-4092c368fbc1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061084 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2adde18-8a3e-4272-aa9d-e0585c6c5f3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6vp6n\" (UID: \"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061096 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-console-config\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061109 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061135 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061150 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-bound-sa-token\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061163 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-service-ca-bundle\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061176 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6whtr\" (UniqueName: \"kubernetes.io/projected/3a9908d7-639d-4f34-a59d-e0a03231a620-kube-api-access-6whtr\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061188 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-config\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061208 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ee28691-2161-4afa-becc-1baaab86202d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061230 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab144c0a-a5be-4645-84db-cfab4a00241c-profile-collector-cert\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061258 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf8mb\" (UniqueName: \"kubernetes.io/projected/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-kube-api-access-zf8mb\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061287 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jb76\" (UniqueName: \"kubernetes.io/projected/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-kube-api-access-6jb76\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061301 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-oauth-serving-cert\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061344 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab144c0a-a5be-4645-84db-cfab4a00241c-srv-cert\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061368 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qts8g\" (UniqueName: \"kubernetes.io/projected/ab144c0a-a5be-4645-84db-cfab4a00241c-kube-api-access-qts8g\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061415 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-serving-cert\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061428 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-service-ca\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061441 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-metrics-certs\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc 
kubenswrapper[4728]: I0125 05:40:45.061465 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b84x5\" (UniqueName: \"kubernetes.io/projected/83a01fe6-fee0-4000-b260-4092c368fbc1-kube-api-access-b84x5\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061532 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c4pb\" (UniqueName: \"kubernetes.io/projected/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-kube-api-access-4c4pb\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061564 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b304a331-a7c7-43fc-8bc7-2b62330056f5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061581 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-images\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061619 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-machine-approver-tls\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061650 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv856\" (UniqueName: \"kubernetes.io/projected/b1151360-e93c-468c-99a7-0343d2417cb0-kube-api-access-mv856\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061685 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqpfq\" (UniqueName: \"kubernetes.io/projected/8402995d-516d-45ba-8f82-d99545fa7334-kube-api-access-fqpfq\") pod \"dns-operator-744455d44c-zd2wv\" (UID: \"8402995d-516d-45ba-8f82-d99545fa7334\") " pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061703 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-auth-proxy-config\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061727 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e426667-1f73-4e01-834a-7876d9495732-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061767 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-oauth-config\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061818 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-policies\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061835 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061860 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-config\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061958 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gvwk6\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-kube-api-access-gvwk6\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.061988 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2mm\" (UniqueName: \"kubernetes.io/projected/e620419c-1014-456e-9c25-98309e3dddb4-kube-api-access-fp2mm\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062008 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062027 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83a01fe6-fee0-4000-b260-4092c368fbc1-trusted-ca\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062041 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e620419c-1014-456e-9c25-98309e3dddb4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" 
(UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062752 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062768 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvdj\" (UniqueName: \"kubernetes.io/projected/1768785d-7da2-4694-a0ff-d010df0868f8-kube-api-access-wvvdj\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062784 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1151360-e93c-468c-99a7-0343d2417cb0-webhook-cert\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062911 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062927 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062957 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-default-certificate\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.062972 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-trusted-ca\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.063022 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxfqp\" (UniqueName: \"kubernetes.io/projected/b2adde18-8a3e-4272-aa9d-e0585c6c5f3f-kube-api-access-rxfqp\") pod \"package-server-manager-789f6589d5-6vp6n\" (UID: \"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.063077 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/755592b0-bef9-4949-8614-144278dea776-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.063115 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.063130 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b1151360-e93c-468c-99a7-0343d2417cb0-tmpfs\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.063145 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ee28691-2161-4afa-becc-1baaab86202d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.063193 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.563179101 +0000 UTC m=+136.599057082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.078780 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-z6t4b"] Jan 25 05:40:45 crc kubenswrapper[4728]: W0125 05:40:45.091349 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2293758_c295_487f_a399_678cc08cf5dc.slice/crio-b994121db3c397248f15886b5986fd41bfcee526c60f11b78669bdf28d8e0823 WatchSource:0}: Error finding container b994121db3c397248f15886b5986fd41bfcee526c60f11b78669bdf28d8e0823: Status 404 returned error can't find the container with id b994121db3c397248f15886b5986fd41bfcee526c60f11b78669bdf28d8e0823 Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.134727 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.137207 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sw8t2"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.140584 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163496 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.163596 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.663578909 +0000 UTC m=+136.699456889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163667 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8402995d-516d-45ba-8f82-d99545fa7334-metrics-tls\") pod \"dns-operator-744455d44c-zd2wv\" (UID: \"8402995d-516d-45ba-8f82-d99545fa7334\") " pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163687 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvdq\" (UniqueName: \"kubernetes.io/projected/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-kube-api-access-8jvdq\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163704 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c41d03a-3cfe-44b3-be40-7946406279c5-cert\") pod \"ingress-canary-zbkbh\" (UID: \"1c41d03a-3cfe-44b3-be40-7946406279c5\") " pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163718 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-metrics-tls\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163734 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kfrg\" (UniqueName: \"kubernetes.io/projected/b153c459-f101-48fe-9dd4-488371396842-kube-api-access-2kfrg\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163749 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gtr\" (UniqueName: \"kubernetes.io/projected/8e426667-1f73-4e01-834a-7876d9495732-kube-api-access-x2gtr\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163777 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-proxy-tls\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163794 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-plugins-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163810 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-trusted-ca-bundle\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163825 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vs94\" (UniqueName: \"kubernetes.io/projected/607307b0-b3c9-4a00-9347-a299a689c1c8-kube-api-access-8vs94\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163840 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163879 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e620419c-1014-456e-9c25-98309e3dddb4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163894 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlzn9\" (UniqueName: \"kubernetes.io/projected/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-kube-api-access-nlzn9\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163908 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl2xr\" (UniqueName: \"kubernetes.io/projected/179b5a3c-109c-46ca-a175-167cb3e32b8b-kube-api-access-dl2xr\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163923 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-tls\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163937 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/20bbec16-8855-4107-864f-386e34654e2d-signing-key\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163952 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a01fe6-fee0-4000-b260-4092c368fbc1-metrics-tls\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163977 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.163995 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7b9\" (UniqueName: \"kubernetes.io/projected/755592b0-bef9-4949-8614-144278dea776-kube-api-access-cb7b9\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164025 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/20bbec16-8855-4107-864f-386e34654e2d-signing-cabundle\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 
25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164041 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164063 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-secret-volume\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164078 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5q75\" (UniqueName: \"kubernetes.io/projected/b304a331-a7c7-43fc-8bc7-2b62330056f5-kube-api-access-h5q75\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164106 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164121 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-config\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164135 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164150 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164172 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1151360-e93c-468c-99a7-0343d2417cb0-apiservice-cert\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc 
kubenswrapper[4728]: I0125 05:40:45.164188 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1768785d-7da2-4694-a0ff-d010df0868f8-serving-cert\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164203 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-config-volume\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164216 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-csi-data-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164230 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-config-volume\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164252 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee28691-2161-4afa-becc-1baaab86202d-config\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164267 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-dir\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164282 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164297 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8gsf\" (UniqueName: \"kubernetes.io/projected/95303aa9-3fb0-48a8-8df8-0f601653ac48-kube-api-access-s8gsf\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.164312 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b153c459-f101-48fe-9dd4-488371396842-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167348 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhq8m\" (UniqueName: 
\"kubernetes.io/projected/20bbec16-8855-4107-864f-386e34654e2d-kube-api-access-lhq8m\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167368 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b153c459-f101-48fe-9dd4-488371396842-serving-cert\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167385 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-certificates\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e620419c-1014-456e-9c25-98309e3dddb4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167413 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-serving-cert\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167427 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1768785d-7da2-4694-a0ff-d010df0868f8-config\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/755592b0-bef9-4949-8614-144278dea776-proxy-tls\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167463 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmpjl\" (UniqueName: \"kubernetes.io/projected/be8794a5-e8f0-4216-ade3-6def48bd8859-kube-api-access-zmpjl\") pod \"downloads-7954f5f757-nlfgj\" (UID: \"be8794a5-e8f0-4216-ade3-6def48bd8859\") " pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167477 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-stats-auth\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167490 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1768785d-7da2-4694-a0ff-d010df0868f8-trusted-ca\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc 
kubenswrapper[4728]: I0125 05:40:45.167505 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b304a331-a7c7-43fc-8bc7-2b62330056f5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167520 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e426667-1f73-4e01-834a-7876d9495732-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167570 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83a01fe6-fee0-4000-b260-4092c368fbc1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.165501 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-dir\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.166002 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee28691-2161-4afa-becc-1baaab86202d-config\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: 
\"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.166305 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-config-volume\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167793 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-mountpoint-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167814 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167832 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167859 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2adde18-8a3e-4272-aa9d-e0585c6c5f3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6vp6n\" (UID: \"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167875 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-console-config\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167891 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167907 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-bound-sa-token\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167928 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-service-ca-bundle\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 
crc kubenswrapper[4728]: I0125 05:40:45.167943 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6whtr\" (UniqueName: \"kubernetes.io/projected/3a9908d7-639d-4f34-a59d-e0a03231a620-kube-api-access-6whtr\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167958 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ee28691-2161-4afa-becc-1baaab86202d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167973 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-config\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.167989 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab144c0a-a5be-4645-84db-cfab4a00241c-profile-collector-cert\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168011 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf8mb\" (UniqueName: 
\"kubernetes.io/projected/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-kube-api-access-zf8mb\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168026 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jb76\" (UniqueName: \"kubernetes.io/projected/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-kube-api-access-6jb76\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168041 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-socket-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168056 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab144c0a-a5be-4645-84db-cfab4a00241c-srv-cert\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168069 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-oauth-serving-cert\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168090 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qts8g\" (UniqueName: \"kubernetes.io/projected/ab144c0a-a5be-4645-84db-cfab4a00241c-kube-api-access-qts8g\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168107 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-metrics-certs\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168124 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168139 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-serving-cert\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168153 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-service-ca\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc 
kubenswrapper[4728]: I0125 05:40:45.168383 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b84x5\" (UniqueName: \"kubernetes.io/projected/83a01fe6-fee0-4000-b260-4092c368fbc1-kube-api-access-b84x5\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168403 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c4pb\" (UniqueName: \"kubernetes.io/projected/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-kube-api-access-4c4pb\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168418 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b304a331-a7c7-43fc-8bc7-2b62330056f5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168434 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-registration-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168450 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-images\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: 
\"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168466 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-machine-approver-tls\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168482 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv856\" (UniqueName: \"kubernetes.io/projected/b1151360-e93c-468c-99a7-0343d2417cb0-kube-api-access-mv856\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168498 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqpfq\" (UniqueName: \"kubernetes.io/projected/8402995d-516d-45ba-8f82-d99545fa7334-kube-api-access-fqpfq\") pod \"dns-operator-744455d44c-zd2wv\" (UID: \"8402995d-516d-45ba-8f82-d99545fa7334\") " pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168513 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-auth-proxy-config\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168527 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8e426667-1f73-4e01-834a-7876d9495732-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168543 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-oauth-config\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168560 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-policies\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168576 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168591 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-config\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 
05:40:45.168607 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwk6\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-kube-api-access-gvwk6\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168623 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2mm\" (UniqueName: \"kubernetes.io/projected/e620419c-1014-456e-9c25-98309e3dddb4-kube-api-access-fp2mm\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168639 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83a01fe6-fee0-4000-b260-4092c368fbc1-trusted-ca\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168670 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzxfb\" (UniqueName: \"kubernetes.io/projected/ba22e26f-3207-4eea-83c9-cbe417c3a521-kube-api-access-zzxfb\") 
pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168685 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgc2\" (UniqueName: \"kubernetes.io/projected/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-kube-api-access-mdgc2\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168700 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e620419c-1014-456e-9c25-98309e3dddb4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168724 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168740 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvdj\" (UniqueName: \"kubernetes.io/projected/1768785d-7da2-4694-a0ff-d010df0868f8-kube-api-access-wvvdj\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168755 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1151360-e93c-468c-99a7-0343d2417cb0-webhook-cert\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168770 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168794 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168809 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/179b5a3c-109c-46ca-a175-167cb3e32b8b-certs\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.168825 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-trusted-ca\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: 
I0125 05:40:45.168930 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8402995d-516d-45ba-8f82-d99545fa7334-metrics-tls\") pod \"dns-operator-744455d44c-zd2wv\" (UID: \"8402995d-516d-45ba-8f82-d99545fa7334\") " pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169018 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-default-certificate\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169055 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxfqp\" (UniqueName: \"kubernetes.io/projected/b2adde18-8a3e-4272-aa9d-e0585c6c5f3f-kube-api-access-rxfqp\") pod \"package-server-manager-789f6589d5-6vp6n\" (UID: \"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169074 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/755592b0-bef9-4949-8614-144278dea776-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169089 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/179b5a3c-109c-46ca-a175-167cb3e32b8b-node-bootstrap-token\") pod \"machine-config-server-pbp7k\" (UID: 
\"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169114 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b1151360-e93c-468c-99a7-0343d2417cb0-tmpfs\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169146 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgjg4\" (UniqueName: \"kubernetes.io/projected/1c41d03a-3cfe-44b3-be40-7946406279c5-kube-api-access-qgjg4\") pod \"ingress-canary-zbkbh\" (UID: \"1c41d03a-3cfe-44b3-be40-7946406279c5\") " pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169161 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ee28691-2161-4afa-becc-1baaab86202d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169436 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/83a01fe6-fee0-4000-b260-4092c368fbc1-metrics-tls\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.169655 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.170933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.171056 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-trusted-ca\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.171154 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ee28691-2161-4afa-becc-1baaab86202d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: 
I0125 05:40:45.171199 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.171525 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b153c459-f101-48fe-9dd4-488371396842-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.171736 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-auth-proxy-config\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.171937 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.172038 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.172209 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.172310 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-service-ca-bundle\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.172668 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.172750 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.173119 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/755592b0-bef9-4949-8614-144278dea776-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.173245 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83a01fe6-fee0-4000-b260-4092c368fbc1-trusted-ca\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.173262 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-config\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.173489 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-trusted-ca-bundle\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.173597 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.174107 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-policies\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.174309 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-config\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.175127 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.176052 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1768785d-7da2-4694-a0ff-d010df0868f8-trusted-ca\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.176159 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-default-certificate\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: 
I0125 05:40:45.176492 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b304a331-a7c7-43fc-8bc7-2b62330056f5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.177560 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b1151360-e93c-468c-99a7-0343d2417cb0-tmpfs\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.177883 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b153c459-f101-48fe-9dd4-488371396842-serving-cert\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.179564 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/20bbec16-8855-4107-864f-386e34654e2d-signing-cabundle\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.179831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e426667-1f73-4e01-834a-7876d9495732-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.181592 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e426667-1f73-4e01-834a-7876d9495732-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.181727 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.181950 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.182128 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-oauth-config\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.182287 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-config\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.182309 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-proxy-tls\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.182752 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-stats-auth\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.182773 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-service-ca\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.183058 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-secret-volume\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.183253 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/e620419c-1014-456e-9c25-98309e3dddb4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.183492 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.183636 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-console-config\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.183760 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.184647 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-certificates\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 
05:40:45.184975 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-images\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.186213 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab144c0a-a5be-4645-84db-cfab4a00241c-profile-collector-cert\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.186498 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e620419c-1014-456e-9c25-98309e3dddb4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.187313 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-oauth-serving-cert\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.187503 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.687493327 +0000 UTC m=+136.723371307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.187770 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.190154 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-machine-approver-tls\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.190413 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab144c0a-a5be-4645-84db-cfab4a00241c-srv-cert\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.190867 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-serving-cert\") pod 
\"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.191425 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.191688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.191731 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-serving-cert\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.191918 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1151360-e93c-468c-99a7-0343d2417cb0-webhook-cert\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.193350 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1768785d-7da2-4694-a0ff-d010df0868f8-serving-cert\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.194574 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.194783 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/20bbec16-8855-4107-864f-386e34654e2d-signing-key\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.195158 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1768785d-7da2-4694-a0ff-d010df0868f8-config\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.195593 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-tls\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.195614 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b304a331-a7c7-43fc-8bc7-2b62330056f5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.195866 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1151360-e93c-468c-99a7-0343d2417cb0-apiservice-cert\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.195896 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.196840 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.198097 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-metrics-certs\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.201734 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvdq\" (UniqueName: \"kubernetes.io/projected/9b2df1fa-0f35-450e-afa6-0456f31f3fcd-kube-api-access-8jvdq\") pod \"machine-config-operator-74547568cd-rfc4h\" (UID: \"9b2df1fa-0f35-450e-afa6-0456f31f3fcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.203205 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/755592b0-bef9-4949-8614-144278dea776-proxy-tls\") pod 
\"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.209480 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2adde18-8a3e-4272-aa9d-e0585c6c5f3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6vp6n\" (UID: \"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: W0125 05:40:45.213898 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod096f04e7_5491_45f6_9290_0a5bd7b7df49.slice/crio-e826c289e2485dd90dc99be22ae3a5833585166fac3405577000246e81451cf3 WatchSource:0}: Error finding container e826c289e2485dd90dc99be22ae3a5833585166fac3405577000246e81451cf3: Status 404 returned error can't find the container with id e826c289e2485dd90dc99be22ae3a5833585166fac3405577000246e81451cf3 Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.223835 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7b9\" (UniqueName: \"kubernetes.io/projected/755592b0-bef9-4949-8614-144278dea776-kube-api-access-cb7b9\") pod \"machine-config-controller-84d6567774-24g5b\" (UID: \"755592b0-bef9-4949-8614-144278dea776\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.242447 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.242677 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83a01fe6-fee0-4000-b260-4092c368fbc1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.247665 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.261401 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kfrg\" (UniqueName: \"kubernetes.io/projected/b153c459-f101-48fe-9dd4-488371396842-kube-api-access-2kfrg\") pod \"openshift-config-operator-7777fb866f-k2b65\" (UID: \"b153c459-f101-48fe-9dd4-488371396842\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.270510 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.270672 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.770660261 +0000 UTC m=+136.806538241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.270953 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-csi-data-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.270984 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-config-volume\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271054 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-mountpoint-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271118 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-socket-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 
crc kubenswrapper[4728]: I0125 05:40:45.271148 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271192 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-registration-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271233 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzxfb\" (UniqueName: \"kubernetes.io/projected/ba22e26f-3207-4eea-83c9-cbe417c3a521-kube-api-access-zzxfb\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgc2\" (UniqueName: \"kubernetes.io/projected/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-kube-api-access-mdgc2\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271272 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/179b5a3c-109c-46ca-a175-167cb3e32b8b-certs\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " 
pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/179b5a3c-109c-46ca-a175-167cb3e32b8b-node-bootstrap-token\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271335 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgjg4\" (UniqueName: \"kubernetes.io/projected/1c41d03a-3cfe-44b3-be40-7946406279c5-kube-api-access-qgjg4\") pod \"ingress-canary-zbkbh\" (UID: \"1c41d03a-3cfe-44b3-be40-7946406279c5\") " pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271350 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c41d03a-3cfe-44b3-be40-7946406279c5-cert\") pod \"ingress-canary-zbkbh\" (UID: \"1c41d03a-3cfe-44b3-be40-7946406279c5\") " pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271368 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-metrics-tls\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271381 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-plugins-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " 
pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.271409 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl2xr\" (UniqueName: \"kubernetes.io/projected/179b5a3c-109c-46ca-a175-167cb3e32b8b-kube-api-access-dl2xr\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.272473 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-config-volume\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.272535 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-csi-data-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.272866 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.772857659 +0000 UTC m=+136.808735639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.272910 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-mountpoint-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.273409 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-socket-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.274035 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-registration-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.274141 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba22e26f-3207-4eea-83c9-cbe417c3a521-plugins-dir\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 
crc kubenswrapper[4728]: I0125 05:40:45.275812 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c41d03a-3cfe-44b3-be40-7946406279c5-cert\") pod \"ingress-canary-zbkbh\" (UID: \"1c41d03a-3cfe-44b3-be40-7946406279c5\") " pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.280679 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/179b5a3c-109c-46ca-a175-167cb3e32b8b-certs\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.281056 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-metrics-tls\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.281779 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/179b5a3c-109c-46ca-a175-167cb3e32b8b-node-bootstrap-token\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.294908 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gtr\" (UniqueName: \"kubernetes.io/projected/8e426667-1f73-4e01-834a-7876d9495732-kube-api-access-x2gtr\") pod \"openshift-controller-manager-operator-756b6f6bc6-qn586\" (UID: \"8e426667-1f73-4e01-834a-7876d9495732\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc 
kubenswrapper[4728]: I0125 05:40:45.310687 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmpjl\" (UniqueName: \"kubernetes.io/projected/be8794a5-e8f0-4216-ade3-6def48bd8859-kube-api-access-zmpjl\") pod \"downloads-7954f5f757-nlfgj\" (UID: \"be8794a5-e8f0-4216-ade3-6def48bd8859\") " pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.335730 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv856\" (UniqueName: \"kubernetes.io/projected/b1151360-e93c-468c-99a7-0343d2417cb0-kube-api-access-mv856\") pod \"packageserver-d55dfcdfc-ql8cn\" (UID: \"b1151360-e93c-468c-99a7-0343d2417cb0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.346981 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-bound-sa-token\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.352184 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.353266 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5l6wr"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.367305 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqpfq\" (UniqueName: \"kubernetes.io/projected/8402995d-516d-45ba-8f82-d99545fa7334-kube-api-access-fqpfq\") pod \"dns-operator-744455d44c-zd2wv\" (UID: \"8402995d-516d-45ba-8f82-d99545fa7334\") " pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 
05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.375582 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.375687 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.875673459 +0000 UTC m=+136.911551429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.375873 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.376157 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-25 05:40:45.876151283 +0000 UTC m=+136.912029263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.390456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8gsf\" (UniqueName: \"kubernetes.io/projected/95303aa9-3fb0-48a8-8df8-0f601653ac48-kube-api-access-s8gsf\") pod \"marketplace-operator-79b997595-qb65z\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.410389 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhq8m\" (UniqueName: \"kubernetes.io/projected/20bbec16-8855-4107-864f-386e34654e2d-kube-api-access-lhq8m\") pod \"service-ca-9c57cc56f-5l465\" (UID: \"20bbec16-8855-4107-864f-386e34654e2d\") " pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.410525 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.425165 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.431778 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e620419c-1014-456e-9c25-98309e3dddb4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.447026 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6whtr\" (UniqueName: \"kubernetes.io/projected/3a9908d7-639d-4f34-a59d-e0a03231a620-kube-api-access-6whtr\") pod \"oauth-openshift-558db77b4-r62pr\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.458033 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.466713 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.467915 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ee28691-2161-4afa-becc-1baaab86202d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pzn76\" (UID: \"3ee28691-2161-4afa-becc-1baaab86202d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.471213 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.477842 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.478430 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:45.978369673 +0000 UTC m=+137.014247652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.483164 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.483259 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.485125 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2mm\" (UniqueName: \"kubernetes.io/projected/e620419c-1014-456e-9c25-98309e3dddb4-kube-api-access-fp2mm\") pod \"cluster-image-registry-operator-dc59b4c8b-bd6wc\" (UID: \"e620419c-1014-456e-9c25-98309e3dddb4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.488137 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.507255 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxfqp\" (UniqueName: \"kubernetes.io/projected/b2adde18-8a3e-4272-aa9d-e0585c6c5f3f-kube-api-access-rxfqp\") pod \"package-server-manager-789f6589d5-6vp6n\" (UID: \"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.527085 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22f81257-1e8d-425b-9bd4-ad46a9b9ad8a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2d28d\" (UID: \"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.540047 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h"] Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.557808 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wvvdj\" (UniqueName: \"kubernetes.io/projected/1768785d-7da2-4694-a0ff-d010df0868f8-kube-api-access-wvvdj\") pod \"console-operator-58897d9998-2jdpb\" (UID: \"1768785d-7da2-4694-a0ff-d010df0868f8\") " pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.565109 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.568538 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vs94\" (UniqueName: \"kubernetes.io/projected/607307b0-b3c9-4a00-9347-a299a689c1c8-kube-api-access-8vs94\") pod \"console-f9d7485db-ntzq4\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.579749 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.580271 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.080253909 +0000 UTC m=+137.116131889 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.583283 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.591842 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.593947 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5l465" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.613879 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c4pb\" (UniqueName: \"kubernetes.io/projected/b4b43f81-6cbe-45f7-95d8-cd60fb107efd-kube-api-access-4c4pb\") pod \"router-default-5444994796-75jc5\" (UID: \"b4b43f81-6cbe-45f7-95d8-cd60fb107efd\") " pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.639713 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwk6\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-kube-api-access-gvwk6\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.648612 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h5q75\" (UniqueName: \"kubernetes.io/projected/b304a331-a7c7-43fc-8bc7-2b62330056f5-kube-api-access-h5q75\") pod \"openshift-apiserver-operator-796bbdcf4f-r67kb\" (UID: \"b304a331-a7c7-43fc-8bc7-2b62330056f5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.666225 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b84x5\" (UniqueName: \"kubernetes.io/projected/83a01fe6-fee0-4000-b260-4092c368fbc1-kube-api-access-b84x5\") pod \"ingress-operator-5b745b69d9-cns5h\" (UID: \"83a01fe6-fee0-4000-b260-4092c368fbc1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.684618 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.684846 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.184825481 +0000 UTC m=+137.220703461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.687876 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.688184 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.188172806 +0000 UTC m=+137.224050785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.702990 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jb76\" (UniqueName: \"kubernetes.io/projected/5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3-kube-api-access-6jb76\") pod \"service-ca-operator-777779d784-9bds2\" (UID: \"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.718878 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.735588 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.756955 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf8mb\" (UniqueName: \"kubernetes.io/projected/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-kube-api-access-zf8mb\") pod \"collect-profiles-29488650-h9sxs\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.757186 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.762821 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qts8g\" (UniqueName: \"kubernetes.io/projected/ab144c0a-a5be-4645-84db-cfab4a00241c-kube-api-access-qts8g\") pod \"catalog-operator-68c6474976-lkwgp\" (UID: \"ab144c0a-a5be-4645-84db-cfab4a00241c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.764555 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.775360 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlzn9\" (UniqueName: \"kubernetes.io/projected/2f6eb12b-af90-4df1-94dd-2452d57c1ab7-kube-api-access-nlzn9\") pod \"machine-approver-56656f9798-r68kw\" (UID: \"2f6eb12b-af90-4df1-94dd-2452d57c1ab7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.775890 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl2xr\" (UniqueName: \"kubernetes.io/projected/179b5a3c-109c-46ca-a175-167cb3e32b8b-kube-api-access-dl2xr\") pod \"machine-config-server-pbp7k\" (UID: \"179b5a3c-109c-46ca-a175-167cb3e32b8b\") " pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.777081 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.789232 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.789524 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.289512492 +0000 UTC m=+137.325390472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.802362 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" event={"ID":"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1","Type":"ContainerStarted","Data":"ad0f5aa84967339b9a00e69f3a5015b5aa5c4e20fa45bb81a8beab9f5c7ad236"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.802393 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" 
event={"ID":"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1","Type":"ContainerStarted","Data":"e65e62bffe4e2dd6d387505df805d652059a85f9eff4ccc4f3bde07af64c9e11"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.808266 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" event={"ID":"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417","Type":"ContainerStarted","Data":"acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.808294 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" event={"ID":"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417","Type":"ContainerStarted","Data":"eecdfc9ff88756a89a98eb4a7f632e0acead1b8ed5d748addf1cf1aea83bbd2e"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.809681 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.811081 4728 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-l8872 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.811129 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" podUID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.812761 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-mdgc2\" (UniqueName: \"kubernetes.io/projected/406a4abe-01c4-41a0-a28e-c4c0fc1f1b44-kube-api-access-mdgc2\") pod \"dns-default-pvqpc\" (UID: \"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44\") " pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.815594 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgjg4\" (UniqueName: \"kubernetes.io/projected/1c41d03a-3cfe-44b3-be40-7946406279c5-kube-api-access-qgjg4\") pod \"ingress-canary-zbkbh\" (UID: \"1c41d03a-3cfe-44b3-be40-7946406279c5\") " pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.822784 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" event={"ID":"7523755e-9f8c-4740-93d2-e35cc4f9757d","Type":"ContainerStarted","Data":"79598a7287184c148e8a6e6ca59ed32f666e763e5ad2be804897b631dd898bfa"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.822815 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" event={"ID":"7523755e-9f8c-4740-93d2-e35cc4f9757d","Type":"ContainerStarted","Data":"fc6d90539c65a8b7dc64ef343e0d34add452d7b418f5ea34e68d9406f08263f2"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.822826 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" event={"ID":"7523755e-9f8c-4740-93d2-e35cc4f9757d","Type":"ContainerStarted","Data":"c9006b4d7dd89b7b0c99822bded1fb49f170a9d68e28b4da23cd8107666bd889"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.825092 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.839472 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.843886 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" event={"ID":"cdc70bbe-b368-404a-8a3e-fafa31a91c0a","Type":"ContainerStarted","Data":"b9d024f34d02195d1c74df03b795476b164943dab8c1a91ff668105fa1e9430c"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.843993 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" event={"ID":"cdc70bbe-b368-404a-8a3e-fafa31a91c0a","Type":"ContainerStarted","Data":"cfbd77e76c0113d4cdc756d039e04e77424b83be070dd15da2328c5251bed1c6"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.846748 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" event={"ID":"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401","Type":"ContainerStarted","Data":"bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.846784 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" event={"ID":"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401","Type":"ContainerStarted","Data":"df21f70529fbb031c7d8937ecb878fbe86d9d40b88954a65348b969a14181775"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.847183 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.849247 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" event={"ID":"d2293758-c295-487f-a399-678cc08cf5dc","Type":"ContainerStarted","Data":"959a1b5041cbbfc9da1b51a4e4d52d84e7b2eb49e7513e874fe45f1b08b42e2f"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.849300 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" event={"ID":"d2293758-c295-487f-a399-678cc08cf5dc","Type":"ContainerStarted","Data":"b994121db3c397248f15886b5986fd41bfcee526c60f11b78669bdf28d8e0823"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.853542 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.855199 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" event={"ID":"652ad7d4-fb59-48ec-936b-305fa0b0966e","Type":"ContainerStarted","Data":"cc302abd5010f9d9e14cbe4549526c1ea5385eb055b1e49bc7ddfcccdeab1682"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.855229 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" event={"ID":"652ad7d4-fb59-48ec-936b-305fa0b0966e","Type":"ContainerStarted","Data":"7ae033aa079af0015c931b36701cc2f16bdd19fb77dfbb0b5d0644a05856fd4b"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.855241 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" event={"ID":"652ad7d4-fb59-48ec-936b-305fa0b0966e","Type":"ContainerStarted","Data":"1e8f04f3611c38c55d7fa8f363e90809ec8ebf5424e88965600056f93139e0a4"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.856476 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" 
event={"ID":"755592b0-bef9-4949-8614-144278dea776","Type":"ContainerStarted","Data":"923424ba6d740a4dd80901751984d013f0ae2c59d00f28d93e7a7f85b5b682db"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.859011 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.860573 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzxfb\" (UniqueName: \"kubernetes.io/projected/ba22e26f-3207-4eea-83c9-cbe417c3a521-kube-api-access-zzxfb\") pod \"csi-hostpathplugin-hh2p9\" (UID: \"ba22e26f-3207-4eea-83c9-cbe417c3a521\") " pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.860905 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" event={"ID":"096f04e7-5491-45f6-9290-0a5bd7b7df49","Type":"ContainerStarted","Data":"061df9a7a2a7f4c170efd92b61f6538a5a5d4e5706208a654a3bacf2fed4b98b"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.860970 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" event={"ID":"096f04e7-5491-45f6-9290-0a5bd7b7df49","Type":"ContainerStarted","Data":"e826c289e2485dd90dc99be22ae3a5833585166fac3405577000246e81451cf3"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.869442 4728 generic.go:334] "Generic (PLEG): container finished" podID="1f291fdb-786d-4cd1-b61f-f54de47908ff" containerID="5b1f94e6a3851deee09e7e4cb18a1f0677d320631bc2374b7cdbc73336252362" exitCode=0 Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.869514 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" 
event={"ID":"1f291fdb-786d-4cd1-b61f-f54de47908ff","Type":"ContainerDied","Data":"5b1f94e6a3851deee09e7e4cb18a1f0677d320631bc2374b7cdbc73336252362"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.869540 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" event={"ID":"1f291fdb-786d-4cd1-b61f-f54de47908ff","Type":"ContainerStarted","Data":"a0a83e31958adc8b24f7580d6a7333a07390eefd3e22a1544efc014efe6e927c"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.871380 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" event={"ID":"a529b2fb-a2b1-4320-9d37-31df46bf3246","Type":"ContainerStarted","Data":"fd9a9e9035173cf90830148706af9bee81950e503e61863483465ad4cfc111b2"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.871406 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" event={"ID":"a529b2fb-a2b1-4320-9d37-31df46bf3246","Type":"ContainerStarted","Data":"3d3b2a4593018ab3df0a72c4756f6ab532e30b543ccf65b2308f50f2a23a0ab3"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.871607 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.873092 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" event={"ID":"2b22b581-8f04-46de-9d8c-7661eae48179","Type":"ContainerStarted","Data":"4363109d3d20f9a51cc3b2cc064b9ff9c9218a3110314dbce1aeaeef10cc8c61"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.875423 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" 
event={"ID":"9b2df1fa-0f35-450e-afa6-0456f31f3fcd","Type":"ContainerStarted","Data":"ab085cabe94cd45fcaf0c63f66c43a2f97c5fd66068dbbf1490f8ea1f792c58d"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.877963 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.879147 4728 generic.go:334] "Generic (PLEG): container finished" podID="b9488cfe-a86c-45af-9f75-5515ba6060ed" containerID="c996a8b3aaf754f94e6bd549e54141082dac16fde54f69ccfd9658f62a94ee48" exitCode=0 Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.879197 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" event={"ID":"b9488cfe-a86c-45af-9f75-5515ba6060ed","Type":"ContainerDied","Data":"c996a8b3aaf754f94e6bd549e54141082dac16fde54f69ccfd9658f62a94ee48"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.879215 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" event={"ID":"b9488cfe-a86c-45af-9f75-5515ba6060ed","Type":"ContainerStarted","Data":"89e8b0234d2c5038955dcc08c8bbfd5073c84b101fba3f5c764fbaefed03ee11"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.883241 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" event={"ID":"768b0ebd-b1de-4848-900d-96d7ef81e650","Type":"ContainerStarted","Data":"c04f27a938a62c2f1d15fbdad375450252482c66814f128301d0c07f89621536"} Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.883264 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" event={"ID":"768b0ebd-b1de-4848-900d-96d7ef81e650","Type":"ContainerStarted","Data":"2b13749d118d43ab9761d1b998fa57a11542149fa7cb578b6ddea586e2bced5c"} Jan 
25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.894709 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.897180 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.397168532 +0000 UTC m=+137.433046511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.903098 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.905702 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.911678 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pbp7k" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.917859 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbkbh" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.941803 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.946462 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pvqpc" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.975417 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" Jan 25 05:40:45 crc kubenswrapper[4728]: I0125 05:40:45.995553 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:45 crc kubenswrapper[4728]: E0125 05:40:45.995890 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.495875415 +0000 UTC m=+137.531753395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.096164 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.096466 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.596454612 +0000 UTC m=+137.632332593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.198386 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.198882 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.698867179 +0000 UTC m=+137.734745158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.306800 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.307716 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.807701508 +0000 UTC m=+137.843579488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.415384 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.416165 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.916128326 +0000 UTC m=+137.952006306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.419136 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.419745 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:46.919730453 +0000 UTC m=+137.955608433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.520239 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.522172 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.022157207 +0000 UTC m=+138.058035187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.541112 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" podStartSLOduration=115.541083014 podStartE2EDuration="1m55.541083014s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:46.535398636 +0000 UTC m=+137.571276616" watchObservedRunningTime="2026-01-25 05:40:46.541083014 +0000 UTC m=+137.576960994" Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.567775 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586"] Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.587792 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nlfgj"] Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.607580 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zd2wv"] Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.611378 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sbs8z" podStartSLOduration=116.611357238 podStartE2EDuration="1m56.611357238s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:46.606878071 +0000 UTC m=+137.642756052" watchObservedRunningTime="2026-01-25 05:40:46.611357238 +0000 UTC m=+137.647235218" Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.625241 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.625930 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.125896695 +0000 UTC m=+138.161774674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.668882 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k2b65"] Jan 25 05:40:46 crc kubenswrapper[4728]: W0125 05:40:46.697583 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e426667_1f73_4e01_834a_7876d9495732.slice/crio-479ba67c00e4f8ee543b4942a1da920bf9d4dcb6f9dc8fbc1d403a3a58405fde WatchSource:0}: Error finding container 479ba67c00e4f8ee543b4942a1da920bf9d4dcb6f9dc8fbc1d403a3a58405fde: Status 404 returned error can't find the container with id 479ba67c00e4f8ee543b4942a1da920bf9d4dcb6f9dc8fbc1d403a3a58405fde Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.734998 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.735349 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.235332764 +0000 UTC m=+138.271210743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.805268 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-z6t4b" podStartSLOduration=116.80523826 podStartE2EDuration="1m56.80523826s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:46.80054094 +0000 UTC m=+137.836418920" watchObservedRunningTime="2026-01-25 05:40:46.80523826 +0000 UTC m=+137.841116239" Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.839884 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.840164 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.340152014 +0000 UTC m=+138.376029993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.907241 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" event={"ID":"b9488cfe-a86c-45af-9f75-5515ba6060ed","Type":"ContainerStarted","Data":"0a94e829721dd89fd76120bd89b6610ba1d8f679a7a65c288b8f933a7700a330"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.913095 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" event={"ID":"8e426667-1f73-4e01-834a-7876d9495732","Type":"ContainerStarted","Data":"479ba67c00e4f8ee543b4942a1da920bf9d4dcb6f9dc8fbc1d403a3a58405fde"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.926050 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-75jc5" event={"ID":"b4b43f81-6cbe-45f7-95d8-cd60fb107efd","Type":"ContainerStarted","Data":"0175afbc089ecb30e7c5d9bee194f6a52d4e1972fbff113f29f89f7acadfc7bf"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.926086 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-75jc5" event={"ID":"b4b43f81-6cbe-45f7-95d8-cd60fb107efd","Type":"ContainerStarted","Data":"a65407951419fbca045f93e09c978b086d7812a53b1fc754f46e67f85d253076"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.929907 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" event={"ID":"2b22b581-8f04-46de-9d8c-7661eae48179","Type":"ContainerStarted","Data":"1ca727f4266df7bee0cf8ef18b58ae4fe0fdf387f38682e086c49a97f2f7947d"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.931720 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" event={"ID":"b153c459-f101-48fe-9dd4-488371396842","Type":"ContainerStarted","Data":"5b5abcaafae0f7a7845472de8114709489470dbb586bf129a1ac81b517d87979"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.934520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" event={"ID":"2f6eb12b-af90-4df1-94dd-2452d57c1ab7","Type":"ContainerStarted","Data":"b5b1254d6b09212426d3560494219f86eb6e6331722036b287e05039a457e673"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.935454 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nlfgj" event={"ID":"be8794a5-e8f0-4216-ade3-6def48bd8859","Type":"ContainerStarted","Data":"a278d2e87293e3db4765340e11bee9e3b38cad26aed2381b8598737f045ea80c"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.936487 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" event={"ID":"9b2df1fa-0f35-450e-afa6-0456f31f3fcd","Type":"ContainerStarted","Data":"ee558e21a3995b5c27afae0b91c6fc9a7e64db22be74b7d9c23629863fd9b876"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.939296 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pbp7k" event={"ID":"179b5a3c-109c-46ca-a175-167cb3e32b8b","Type":"ContainerStarted","Data":"92d615cfdf418ce199c9f7b2dc1397f88457d432cf96fc6f80613565e7e0c508"} Jan 25 05:40:46 crc 
kubenswrapper[4728]: I0125 05:40:46.939334 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pbp7k" event={"ID":"179b5a3c-109c-46ca-a175-167cb3e32b8b","Type":"ContainerStarted","Data":"e93903f2eece236caf52b1299fa618c3f9b0411974ae6ccea7b9db0c0494c3d9"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.940304 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.941932 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.441897478 +0000 UTC m=+138.477775458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.944127 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:46 crc kubenswrapper[4728]: E0125 05:40:46.944457 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.444446802 +0000 UTC m=+138.480324782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.945642 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" event={"ID":"1f291fdb-786d-4cd1-b61f-f54de47908ff","Type":"ContainerStarted","Data":"d87d471c13967aca51a6e3487bb9fe88ef6193f86efe94df2a95650ce064e0ab"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.946973 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pb5ln" podStartSLOduration=115.946959237 podStartE2EDuration="1m55.946959237s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:46.944090427 +0000 UTC m=+137.979968408" watchObservedRunningTime="2026-01-25 05:40:46.946959237 +0000 UTC m=+137.982837217" Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.953649 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" event={"ID":"755592b0-bef9-4949-8614-144278dea776","Type":"ContainerStarted","Data":"496204f74f9273b1a32be4306d01c987620ad08ac3e095a39ba4aed4f98a1611"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.967963 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" 
event={"ID":"cdc70bbe-b368-404a-8a3e-fafa31a91c0a","Type":"ContainerStarted","Data":"6159f2158491f551f538a81bdff2587efb033f672cafa9e78312df5d545d5016"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.976877 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" event={"ID":"bd78ece9-7fa3-40b3-b4cf-2b4533e37eb1","Type":"ContainerStarted","Data":"5dfc148fe5321fef99e00b82d7a86c03e364c7241a5879e9a44f0f5f2b46ed16"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.985698 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" event={"ID":"8402995d-516d-45ba-8f82-d99545fa7334","Type":"ContainerStarted","Data":"b0b863e02372bb13150417846c7fb47cc8e9feb7337b1048a7cf0208273a85f5"} Jan 25 05:40:46 crc kubenswrapper[4728]: I0125 05:40:46.994912 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.045157 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.046920 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.546854018 +0000 UTC m=+138.582731999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.082655 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" podStartSLOduration=117.082635415 podStartE2EDuration="1m57.082635415s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.081282555 +0000 UTC m=+138.117160534" watchObservedRunningTime="2026-01-25 05:40:47.082635415 +0000 UTC m=+138.118513395" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.155818 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.156244 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.656232749 +0000 UTC m=+138.692110729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.220251 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-24kpz" podStartSLOduration=117.220230544 podStartE2EDuration="1m57.220230544s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.178471896 +0000 UTC m=+138.214349876" watchObservedRunningTime="2026-01-25 05:40:47.220230544 +0000 UTC m=+138.256108524" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.220726 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hplc9" podStartSLOduration=117.220723258 podStartE2EDuration="1m57.220723258s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.219945015 +0000 UTC m=+138.255822995" watchObservedRunningTime="2026-01-25 05:40:47.220723258 +0000 UTC m=+138.256601238" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.257267 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.257924 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.757907639 +0000 UTC m=+138.793785619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.263234 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.279083 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r62pr"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.315533 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.315886 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hl5k8" podStartSLOduration=117.315869402 podStartE2EDuration="1m57.315869402s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-25 05:40:47.308860267 +0000 UTC m=+138.344738248" watchObservedRunningTime="2026-01-25 05:40:47.315869402 +0000 UTC m=+138.351747382" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.358830 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.359308 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.85929746 +0000 UTC m=+138.895175439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.359591 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.377275 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ntzq4"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.405381 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5l465"] Jan 25 05:40:47 crc 
kubenswrapper[4728]: I0125 05:40:47.417640 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb65z"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.459513 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.459811 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:47.959795393 +0000 UTC m=+138.995673373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.561616 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.561927 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.061916688 +0000 UTC m=+139.097794669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.562795 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cz94k" podStartSLOduration=116.562780523 podStartE2EDuration="1m56.562780523s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.562281478 +0000 UTC m=+138.598159458" watchObservedRunningTime="2026-01-25 05:40:47.562780523 +0000 UTC m=+138.598658503" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.566710 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.624530 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.626698 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.628258 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.645314 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbkbh"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.645367 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pvqpc"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.646212 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-75jc5" podStartSLOduration=117.646202269 podStartE2EDuration="1m57.646202269s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.645206425 +0000 UTC m=+138.681084404" watchObservedRunningTime="2026-01-25 05:40:47.646202269 +0000 UTC m=+138.682080249" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.647973 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9bds2"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.651968 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hh2p9"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.662800 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.663208 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.163195258 +0000 UTC m=+139.199073238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: W0125 05:40:47.666626 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a01fe6_fee0_4000_b260_4092c368fbc1.slice/crio-8bb55a961d6a4cffc78fc5f8382f53209752274f38ca031441249d2306f412b9 WatchSource:0}: Error finding container 8bb55a961d6a4cffc78fc5f8382f53209752274f38ca031441249d2306f412b9: Status 404 returned error can't find the container with id 8bb55a961d6a4cffc78fc5f8382f53209752274f38ca031441249d2306f412b9 Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.702738 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" podStartSLOduration=117.702574805 podStartE2EDuration="1m57.702574805s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.676825914 +0000 UTC m=+138.712703884" watchObservedRunningTime="2026-01-25 05:40:47.702574805 +0000 UTC m=+138.738452825" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.703648 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb"] Jan 25 05:40:47 crc 
kubenswrapper[4728]: I0125 05:40:47.719155 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.745195 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7v4ng" podStartSLOduration=117.745174254 podStartE2EDuration="1m57.745174254s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.726687758 +0000 UTC m=+138.762565738" watchObservedRunningTime="2026-01-25 05:40:47.745174254 +0000 UTC m=+138.781052234" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.764298 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.764640 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.264628681 +0000 UTC m=+139.300506662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.808096 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5l6wr" podStartSLOduration=117.808077488 podStartE2EDuration="1m57.808077488s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.798660978 +0000 UTC m=+138.834538958" watchObservedRunningTime="2026-01-25 05:40:47.808077488 +0000 UTC m=+138.843955469" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.811843 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2jdpb"] Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.844305 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.856416 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pbp7k" podStartSLOduration=5.856400771 podStartE2EDuration="5.856400771s" podCreationTimestamp="2026-01-25 05:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.855731424 +0000 UTC m=+138.891609405" watchObservedRunningTime="2026-01-25 05:40:47.856400771 
+0000 UTC m=+138.892278751" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.860523 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:47 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:47 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:47 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.860585 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.867166 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.868081 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.368050337 +0000 UTC m=+139.403928318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.977247 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" podStartSLOduration=116.977227737 podStartE2EDuration="1m56.977227737s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.926477211 +0000 UTC m=+138.962355190" watchObservedRunningTime="2026-01-25 05:40:47.977227737 +0000 UTC m=+139.013105717" Jan 25 05:40:47 crc kubenswrapper[4728]: I0125 05:40:47.979820 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:47 crc kubenswrapper[4728]: E0125 05:40:47.980103 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.480091736 +0000 UTC m=+139.515969717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.014811 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mmk2s" podStartSLOduration=118.014791707 podStartE2EDuration="1m58.014791707s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:47.977832091 +0000 UTC m=+139.013710071" watchObservedRunningTime="2026-01-25 05:40:48.014791707 +0000 UTC m=+139.050669686" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.053472 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" event={"ID":"3ee28691-2161-4afa-becc-1baaab86202d","Type":"ContainerStarted","Data":"43fcfcdb26fdc6743f27f2f1893896ebedc19a34e7ac3d4d2262dbf806cff6bf"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.059851 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5l465" event={"ID":"20bbec16-8855-4107-864f-386e34654e2d","Type":"ContainerStarted","Data":"9124d8a7c23e76af87183d1b36e5cc2d956bdd1d2fdf4cffc57fc1122ac58530"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.067066 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" 
event={"ID":"8402995d-516d-45ba-8f82-d99545fa7334","Type":"ContainerStarted","Data":"b95d3259cc9bfbe944882e39069a9f258c64e6bc2c0acaf560b12665d1fbb7da"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.072854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" event={"ID":"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f","Type":"ContainerStarted","Data":"611458d466087021f3c62eda6783a167bdeebefd3c37df3a9f43471880ba21f1"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.072888 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" event={"ID":"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f","Type":"ContainerStarted","Data":"04efb88f89b308d6b212c12e81ec7f1eecfedf009203ed590b632dfa1566f22b"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.073445 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" event={"ID":"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a","Type":"ContainerStarted","Data":"97146cd0267ef278a2e780fbe567c532a0f7c4556fb78215d425dbb99601e7ad"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.075538 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" event={"ID":"b304a331-a7c7-43fc-8bc7-2b62330056f5","Type":"ContainerStarted","Data":"12e29b34f03e820dd9c8e5d2f0739ae172f32c94f0d8e6121df60e80ccd154e1"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.080862 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:48 crc kubenswrapper[4728]: 
E0125 05:40:48.081142 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.581129822 +0000 UTC m=+139.617007792 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.082091 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nlfgj" event={"ID":"be8794a5-e8f0-4216-ade3-6def48bd8859","Type":"ContainerStarted","Data":"ac4b8c36ece2843457cdf483eaa0e6acffb6e96cbf4b1ebc3302db864d19bfe1"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.082645 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.086154 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" podStartSLOduration=118.086141577 podStartE2EDuration="1m58.086141577s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.016129569 +0000 UTC m=+139.052007549" watchObservedRunningTime="2026-01-25 05:40:48.086141577 +0000 UTC m=+139.122019556" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.086398 4728 
patch_prober.go:28] interesting pod/downloads-7954f5f757-nlfgj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.086435 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nlfgj" podUID="be8794a5-e8f0-4216-ade3-6def48bd8859" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.086776 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" podStartSLOduration=118.086771179 podStartE2EDuration="1m58.086771179s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.085690613 +0000 UTC m=+139.121568593" watchObservedRunningTime="2026-01-25 05:40:48.086771179 +0000 UTC m=+139.122649159" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.093748 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rfc4h" event={"ID":"9b2df1fa-0f35-450e-afa6-0456f31f3fcd","Type":"ContainerStarted","Data":"962f9af27c35176a769160a59d3c804e071c250feb9ea39689311da5a41ce99d"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.101571 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" event={"ID":"ba22e26f-3207-4eea-83c9-cbe417c3a521","Type":"ContainerStarted","Data":"dfee693c83cac683aea9cdc320dacc12d06a7dd08e3005a9bfb993fd411394aa"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.112431 4728 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" event={"ID":"8e426667-1f73-4e01-834a-7876d9495732","Type":"ContainerStarted","Data":"45ff3d6e2726968b643981b45adfc0b05f11214b6a09abd0d7b35d8382af9f3f"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.113556 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nlfgj" podStartSLOduration=118.113546471 podStartE2EDuration="1m58.113546471s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.112527772 +0000 UTC m=+139.148405753" watchObservedRunningTime="2026-01-25 05:40:48.113546471 +0000 UTC m=+139.149424451" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.118060 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" event={"ID":"e620419c-1014-456e-9c25-98309e3dddb4","Type":"ContainerStarted","Data":"17ac128bbbe5e1c0dc3b23efa8e53b1386d583674405e8412a5d31855d1d92ec"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.123201 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" event={"ID":"83a01fe6-fee0-4000-b260-4092c368fbc1","Type":"ContainerStarted","Data":"8bb55a961d6a4cffc78fc5f8382f53209752274f38ca031441249d2306f412b9"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.170185 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24g5b" event={"ID":"755592b0-bef9-4949-8614-144278dea776","Type":"ContainerStarted","Data":"a47362a34c74d8b0ffb96afd5490b0100bd1e2c3d1fb5139103b92d4f3ff056a"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.175866 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" event={"ID":"2f6eb12b-af90-4df1-94dd-2452d57c1ab7","Type":"ContainerStarted","Data":"ad2137b47df38e890421c9de16629c5a3bfdd589a3ba8a662ca68803391b3b83"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.175900 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" event={"ID":"2f6eb12b-af90-4df1-94dd-2452d57c1ab7","Type":"ContainerStarted","Data":"9533c568bc829070d76624ebb23156230f8545e8fcc25195ed3b4b129b6ff46a"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.184545 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.185447 4728 generic.go:334] "Generic (PLEG): container finished" podID="b153c459-f101-48fe-9dd4-488371396842" containerID="f03db9a210ae5d7ab6f2d676f10f15522be65bc9600032504be237f55c111853" exitCode=0 Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.185853 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" event={"ID":"b153c459-f101-48fe-9dd4-488371396842","Type":"ContainerDied","Data":"f03db9a210ae5d7ab6f2d676f10f15522be65bc9600032504be237f55c111853"} Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.186513 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.686495107 +0000 UTC m=+139.722373087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.199757 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pvqpc" event={"ID":"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44","Type":"ContainerStarted","Data":"92873b38894e6c88cafb27d88eb4e1a93de1f6e8a40490b167487145852d6830"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.211860 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qn586" podStartSLOduration=118.211838659 podStartE2EDuration="1m58.211838659s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.13690635 +0000 UTC m=+139.172784330" watchObservedRunningTime="2026-01-25 05:40:48.211838659 +0000 UTC m=+139.247716639" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.212110 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r68kw" podStartSLOduration=118.212106426 podStartE2EDuration="1m58.212106426s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.209996263 +0000 UTC m=+139.245874243" watchObservedRunningTime="2026-01-25 05:40:48.212106426 +0000 UTC m=+139.247984406" Jan 25 
05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.216593 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbkbh" event={"ID":"1c41d03a-3cfe-44b3-be40-7946406279c5","Type":"ContainerStarted","Data":"e525ad142caaa9e57260fc54d330e7fb89563a9ddc9c48ed81aa846ea0b7ed64"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.247708 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" event={"ID":"3a9908d7-639d-4f34-a59d-e0a03231a620","Type":"ContainerStarted","Data":"93eee35971ec372c3de4487632a075752eae4c0c8108ebbb00f740b1b5e6d077"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.248505 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.262714 4728 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r62pr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.262743 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" podUID="3a9908d7-639d-4f34-a59d-e0a03231a620" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.262809 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" event={"ID":"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68","Type":"ContainerStarted","Data":"cc1f453e707064c313e5bec008cad4e2e41239e2dc506bc441d650951faa1acf"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.271520 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" podStartSLOduration=118.271511501 podStartE2EDuration="1m58.271511501s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.269384024 +0000 UTC m=+139.305262004" watchObservedRunningTime="2026-01-25 05:40:48.271511501 +0000 UTC m=+139.307389481" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.283690 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" event={"ID":"b1151360-e93c-468c-99a7-0343d2417cb0","Type":"ContainerStarted","Data":"1f8e6ba32cda857483fc94f5d22caeee0419cd989e031a716e3d5623809f5fb7"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.283732 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" event={"ID":"b1151360-e93c-468c-99a7-0343d2417cb0","Type":"ContainerStarted","Data":"78be316798ee125b3a007e814ac8e85e1302ac2a53d2782e6d06f2d0252b3c0b"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.284484 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.286416 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" event={"ID":"95303aa9-3fb0-48a8-8df8-0f601653ac48","Type":"ContainerStarted","Data":"30cb1327d3b79775309161b1127dd15937b5fd2347dcb12170d3c9c768fdac64"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.286929 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 
05:40:48.292640 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.294209 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.794190901 +0000 UTC m=+139.830068882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.294709 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" podStartSLOduration=118.294690156 podStartE2EDuration="1m58.294690156s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.292865753 +0000 UTC m=+139.328743733" watchObservedRunningTime="2026-01-25 05:40:48.294690156 +0000 UTC m=+139.330568136" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.296170 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntzq4" 
event={"ID":"607307b0-b3c9-4a00-9347-a299a689c1c8","Type":"ContainerStarted","Data":"e6d8128d4e4fc874f92f75806092eeab8f00e39fb32b8662f69daefc2dfe2060"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.296781 4728 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qb65z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.296811 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.296843 4728 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ql8cn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.296898 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" podUID="b1151360-e93c-468c-99a7-0343d2417cb0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.318294 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" 
event={"ID":"ab144c0a-a5be-4645-84db-cfab4a00241c","Type":"ContainerStarted","Data":"2dfb369938bca5f649c13acb81d191992e7f0a6eca1268666b57c1685355b2d4"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.326027 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" podStartSLOduration=117.326016201 podStartE2EDuration="1m57.326016201s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.318883101 +0000 UTC m=+139.354761082" watchObservedRunningTime="2026-01-25 05:40:48.326016201 +0000 UTC m=+139.361894182" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.360070 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" event={"ID":"1f291fdb-786d-4cd1-b61f-f54de47908ff","Type":"ContainerStarted","Data":"9215eb9122b498b9ef385bc80a2b376c8af91fee295e1382ceae9771c43b4881"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.382236 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn" podStartSLOduration=117.382215088 podStartE2EDuration="1m57.382215088s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.354977511 +0000 UTC m=+139.390855490" watchObservedRunningTime="2026-01-25 05:40:48.382215088 +0000 UTC m=+139.418093068" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.384182 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ntzq4" podStartSLOduration=118.384175468 podStartE2EDuration="1m58.384175468s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.377372593 +0000 UTC m=+139.413250573" watchObservedRunningTime="2026-01-25 05:40:48.384175468 +0000 UTC m=+139.420053448" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.395007 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.396653 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.896638043 +0000 UTC m=+139.932516023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.408784 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" podStartSLOduration=118.408771256 podStartE2EDuration="1m58.408771256s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.407424567 +0000 UTC m=+139.443302547" watchObservedRunningTime="2026-01-25 05:40:48.408771256 +0000 UTC m=+139.444649236" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.422573 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" event={"ID":"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3","Type":"ContainerStarted","Data":"4f9200fda7751a78e93d40667d536277d4e3444486586d4c255c2bc5263bfa7c"} Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.445776 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" podStartSLOduration=117.445754367 podStartE2EDuration="1m57.445754367s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:48.44448805 +0000 UTC m=+139.480366030" watchObservedRunningTime="2026-01-25 05:40:48.445754367 +0000 UTC m=+139.481632347" Jan 25 05:40:48 crc 
kubenswrapper[4728]: I0125 05:40:48.496386 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.497508 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:48.997489696 +0000 UTC m=+140.033367676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.588967 4728 csr.go:261] certificate signing request csr-dxczf is approved, waiting to be issued Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.597756 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.598115 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.098102306 +0000 UTC m=+140.133980286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.600149 4728 csr.go:257] certificate signing request csr-dxczf is issued Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.698723 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.698916 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.198891119 +0000 UTC m=+140.234769099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.699410 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.699732 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.199718936 +0000 UTC m=+140.235596915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.800600 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.800820 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.300784884 +0000 UTC m=+140.336662863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.843737 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:48 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:48 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:48 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.843809 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:48 crc kubenswrapper[4728]: I0125 05:40:48.901956 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:48 crc kubenswrapper[4728]: E0125 05:40:48.902212 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-25 05:40:49.402201255 +0000 UTC m=+140.438079235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.003534 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.003700 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.503676317 +0000 UTC m=+140.539554297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.003968 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.004251 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.504240585 +0000 UTC m=+140.540118566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.105192 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.105376 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.605350887 +0000 UTC m=+140.641228867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.105515 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.105774 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.605763798 +0000 UTC m=+140.641641778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.206342 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.206498 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.706477389 +0000 UTC m=+140.742355369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.206589 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.207008 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.706990731 +0000 UTC m=+140.742868711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.307169 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.307347 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.807307781 +0000 UTC m=+140.843185761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.307792 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.308096 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.808081516 +0000 UTC m=+140.843959496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.410716 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.411049 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:49.911037681 +0000 UTC m=+140.946915661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.445351 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" event={"ID":"e620419c-1014-456e-9c25-98309e3dddb4","Type":"ContainerStarted","Data":"b8e9f691fdc2248cbec3839876d187dd93ec97455c62585dc1ac4f41dba2d0a8"} Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.468192 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2jdpb" event={"ID":"1768785d-7da2-4694-a0ff-d010df0868f8","Type":"ContainerStarted","Data":"61a593a7f8ed12384eaeb34657c61ce7cf3ff447fba15852654965efcd3f0b4b"} Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.468241 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2jdpb" event={"ID":"1768785d-7da2-4694-a0ff-d010df0868f8","Type":"ContainerStarted","Data":"1b1cfb72fc1206bd5dfea9b37a775dfd4e9e11a14460b2beeae59e3479fc55cf"} Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.469667 4728 patch_prober.go:28] interesting pod/console-operator-58897d9998-2jdpb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.469701 4728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-2jdpb" podUID="1768785d-7da2-4694-a0ff-d010df0868f8" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.469914 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2jdpb"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.487912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" event={"ID":"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68","Type":"ContainerStarted","Data":"c9f667c9c53b8d3ea2f9be2f47a6b9a0955d6c9ee583815ebe2bc20d8a86d114"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.515097 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" event={"ID":"ba22e26f-3207-4eea-83c9-cbe417c3a521","Type":"ContainerStarted","Data":"0c3b27939fdb705fd83e739552112ba7994c79509315a577b98e7ef76da41bc6"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.515139 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" event={"ID":"ba22e26f-3207-4eea-83c9-cbe417c3a521","Type":"ContainerStarted","Data":"d4dbea271e729e9a9253515fafe0c93aef0ec0f45a16e94eca80d78f8666b229"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.516112 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.516950 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.0169389 +0000 UTC m=+141.052816880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.547262 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" event={"ID":"b2adde18-8a3e-4272-aa9d-e0585c6c5f3f","Type":"ContainerStarted","Data":"9406cb749934b408e17f94b16783d0fef7cfdfea3c2b83ede3da96d7e9658f8e"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.547933 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.575048 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" event={"ID":"22f81257-1e8d-425b-9bd4-ad46a9b9ad8a","Type":"ContainerStarted","Data":"3ae9614dab1de5289ffaa41b0cbd5d845bbbbe822bc1021912661735497544e0"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.584544 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" event={"ID":"b304a331-a7c7-43fc-8bc7-2b62330056f5","Type":"ContainerStarted","Data":"c5adc6e9ae333d2171a7270ca8e35d36c9f357bfc1048098cb231e271a34a13c"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.605410 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-25 05:35:48 +0000 UTC, rotation deadline is 2026-11-05 16:50:25.036866823 +0000 UTC
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.605452 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6827h9m35.431416945s for next certificate rotation
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.606081 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" event={"ID":"3ee28691-2161-4afa-becc-1baaab86202d","Type":"ContainerStarted","Data":"e8f453909caa25205262b042080170a06435a516b1561032a23a8aafbae7207e"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.614591 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2jdpb" podStartSLOduration=119.6145768 podStartE2EDuration="1m59.6145768s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.613310253 +0000 UTC m=+140.649188233" watchObservedRunningTime="2026-01-25 05:40:49.6145768 +0000 UTC m=+140.650454769"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.622002 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.623098 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.123087476 +0000 UTC m=+141.158965456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.624637 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" event={"ID":"3a9908d7-639d-4f34-a59d-e0a03231a620","Type":"ContainerStarted","Data":"24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.630363 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.641707 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbkbh" event={"ID":"1c41d03a-3cfe-44b3-be40-7946406279c5","Type":"ContainerStarted","Data":"e801446d3b75a4c3253c94b54750dbb4dc07c68e6dd6d2b5df9d0a64d7a0649c"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.648817 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zd2wv" event={"ID":"8402995d-516d-45ba-8f82-d99545fa7334","Type":"ContainerStarted","Data":"5ff35910852afff2dc86d77a9c0eef54405f528b083db0599eae93064ea79e38"}
Jan 25 05:40:49 crc
kubenswrapper[4728]: I0125 05:40:49.665366 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntzq4" event={"ID":"607307b0-b3c9-4a00-9347-a299a689c1c8","Type":"ContainerStarted","Data":"cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.679614 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" event={"ID":"ab144c0a-a5be-4645-84db-cfab4a00241c","Type":"ContainerStarted","Data":"77fbae79365baab3bcf03a1fba1e5266e1fec93f45dbaaef4bc8b04092a2cd1d"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.680708 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.683445 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" podStartSLOduration=119.683433592 podStartE2EDuration="1m59.683433592s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.657578441 +0000 UTC m=+140.693456421" watchObservedRunningTime="2026-01-25 05:40:49.683433592 +0000 UTC m=+140.719311572"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.684719 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bd6wc" podStartSLOduration=119.684710369 podStartE2EDuration="1m59.684710369s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.683133184 +0000 UTC m=+140.719011164" watchObservedRunningTime="2026-01-25 05:40:49.684710369 +0000 UTC m=+140.720588349"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.691204 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.697457 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" event={"ID":"b153c459-f101-48fe-9dd4-488371396842","Type":"ContainerStarted","Data":"8a8b08060b29fd226aca7277a37b1fe0082d584b28c808f228b50ed823529d97"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.698019 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.710798 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9bds2" event={"ID":"5d7358b8-9e6d-44ec-b4ed-6c4d629b98f3","Type":"ContainerStarted","Data":"4f2839a648bfcbcd8d985b60538a31677c3020deb4025e8d55046f2b0ad3ad89"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.713692 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" event={"ID":"95303aa9-3fb0-48a8-8df8-0f601653ac48","Type":"ContainerStarted","Data":"6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.714431 4728 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qb65z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.714471 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.718301 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5l465" event={"ID":"20bbec16-8855-4107-864f-386e34654e2d","Type":"ContainerStarted","Data":"5d19c105cfb5bbf1c09875cd023e28cb69160005df70a98eb0a3d4ece2875900"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.723457 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.724965 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.224950934 +0000 UTC m=+141.260828914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.730814 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" event={"ID":"83a01fe6-fee0-4000-b260-4092c368fbc1","Type":"ContainerStarted","Data":"1ff625cea9d7f113c948eb663b342f05d230ca2e2051203291e094ed85b4b4a5"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.731060 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" event={"ID":"83a01fe6-fee0-4000-b260-4092c368fbc1","Type":"ContainerStarted","Data":"7bd7f0f9b7dc6844e198e8ee9447bb7ff3a4e4759686a113f457902ad4bd620f"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.734854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pvqpc" event={"ID":"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44","Type":"ContainerStarted","Data":"6e6cb0b6f7a8c36eeb6f17e0919d9eb6b77b42076b30533bde4c8df46193c378"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.734882 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pvqpc" event={"ID":"406a4abe-01c4-41a0-a28e-c4c0fc1f1b44","Type":"ContainerStarted","Data":"e2008b558d8be12577f661b3da2ed289f3c300b0d806f396a984e1e71690805f"}
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.734904 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pvqpc"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.737072 4728
patch_prober.go:28] interesting pod/downloads-7954f5f757-nlfgj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.737121 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nlfgj" podUID="be8794a5-e8f0-4216-ade3-6def48bd8859" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.740000 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.740701 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.745148 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.745175 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.746824 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ql8cn"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.750673 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.759753 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zbkbh" podStartSLOduration=7.759743048 podStartE2EDuration="7.759743048s" podCreationTimestamp="2026-01-25 05:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.758773203 +0000 UTC m=+140.794651183" watchObservedRunningTime="2026-01-25 05:40:49.759743048 +0000 UTC m=+140.795621028"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.760027 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r67kb" podStartSLOduration=119.760023449 podStartE2EDuration="1m59.760023449s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.73305448 +0000 UTC m=+140.768932460" watchObservedRunningTime="2026-01-25 05:40:49.760023449 +0000 UTC m=+140.795901428"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.760106 4728 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sw8t2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]log ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]etcd ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/start-apiserver-admission-initializer ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/generic-apiserver-start-informers ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/max-in-flight-filter ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/storage-object-count-tracker-hook ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Jan 25 05:40:49 crc kubenswrapper[4728]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/project.openshift.io-projectcache ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-startinformers ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-restmapperupdater ok
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Jan 25 05:40:49 crc kubenswrapper[4728]: livez check failed
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.760156 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" podUID="1f291fdb-786d-4cd1-b61f-f54de47908ff" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.780714 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pzn76" podStartSLOduration=119.780698596 podStartE2EDuration="1m59.780698596s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.779704324 +0000 UTC m=+140.815582305" watchObservedRunningTime="2026-01-25 05:40:49.780698596 +0000 UTC m=+140.816576577"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.824080 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.826611 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.326599024 +0000 UTC m=+141.362477004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.850478 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 25 05:40:49 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld
Jan 25 05:40:49 crc kubenswrapper[4728]: [+]process-running ok
Jan 25 05:40:49 crc kubenswrapper[4728]: healthz check failed
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.850513 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.859806 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lkwgp" podStartSLOduration=119.859789136
podStartE2EDuration="1m59.859789136s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.818177615 +0000 UTC m=+140.854055596" watchObservedRunningTime="2026-01-25 05:40:49.859789136 +0000 UTC m=+140.895667116"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.861907 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2d28d" podStartSLOduration=119.86189918 podStartE2EDuration="1m59.86189918s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.859472446 +0000 UTC m=+140.895350426" watchObservedRunningTime="2026-01-25 05:40:49.86189918 +0000 UTC m=+140.897777160"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.919732 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pvqpc" podStartSLOduration=7.919712742 podStartE2EDuration="7.919712742s" podCreationTimestamp="2026-01-25 05:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.908840386 +0000 UTC m=+140.944718366" watchObservedRunningTime="2026-01-25 05:40:49.919712742 +0000 UTC m=+140.955590721"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.929114 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:49 crc kubenswrapper[4728]: E0125 05:40:49.929423 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.429411997 +0000 UTC m=+141.465289977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.931052 4728 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.950656 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5l465" podStartSLOduration=118.950637687 podStartE2EDuration="1m58.950637687s" podCreationTimestamp="2026-01-25 05:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.948280396 +0000 UTC m=+140.984158376" watchObservedRunningTime="2026-01-25 05:40:49.950637687 +0000 UTC m=+140.986515668"
Jan 25 05:40:49 crc kubenswrapper[4728]: I0125 05:40:49.977368 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" podStartSLOduration=119.977357986 podStartE2EDuration="1m59.977357986s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:49.976272702 +0000 UTC m=+141.012150681" watchObservedRunningTime="2026-01-25 05:40:49.977357986 +0000 UTC m=+141.013235966"
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.031951 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.032514 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.532498069 +0000 UTC m=+141.568376049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.034309 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322a41cf_af1e_4c7e_80e3_b7c7a32e8c68.slice/crio-c9f667c9c53b8d3ea2f9be2f47a6b9a0955d6c9ee583815ebe2bc20d8a86d114.scope\": RecentStats: unable to find data in memory cache]"
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.133525 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.133802 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.633790996 +0000 UTC m=+141.669668977 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.234982 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.235161 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.735138076 +0000 UTC m=+141.771016056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.235451 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.235754 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.735743773 +0000 UTC m=+141.771621753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.336978 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.337209 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.837171976 +0000 UTC m=+141.873049956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.337345 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.337737 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.83771819 +0000 UTC m=+141.873596170 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.439017 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.439265 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.939214732 +0000 UTC m=+141.975092712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.439532 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g"
Jan 25 05:40:50 crc kubenswrapper[4728]: E0125 05:40:50.440048 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-25 05:40:50.940040044 +0000 UTC m=+141.975918024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-22q6g" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.465965 4728 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-25T05:40:49.931076218Z","Handler":null,"Name":""}
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.470220 4728 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.470256 4728 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.540555 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.545679 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID:
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.641908 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.644230 4728 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.644384 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.675620 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-22q6g\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.705550 4728 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cns5h" podStartSLOduration=120.705519395 podStartE2EDuration="2m0.705519395s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:50.063349976 +0000 UTC m=+141.099227956" watchObservedRunningTime="2026-01-25 05:40:50.705519395 +0000 UTC m=+141.741397376" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.707378 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7c9g"] Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.709508 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.711704 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.719482 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7c9g"] Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.743709 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" event={"ID":"ba22e26f-3207-4eea-83c9-cbe417c3a521","Type":"ContainerStarted","Data":"af729b6f7935f145a0531784908ddcbe277f847fcc2c7f48e40cff7f2af8f5d1"} Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.743755 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" event={"ID":"ba22e26f-3207-4eea-83c9-cbe417c3a521","Type":"ContainerStarted","Data":"27cc20c816f0654995e5578108fb6c0482df01835d4203f81bc7da40dc730ac9"} Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.745996 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" containerID="c9f667c9c53b8d3ea2f9be2f47a6b9a0955d6c9ee583815ebe2bc20d8a86d114" exitCode=0 Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.746226 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" event={"ID":"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68","Type":"ContainerDied","Data":"c9f667c9c53b8d3ea2f9be2f47a6b9a0955d6c9ee583815ebe2bc20d8a86d114"} Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.747392 4728 patch_prober.go:28] interesting pod/downloads-7954f5f757-nlfgj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.747437 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nlfgj" podUID="be8794a5-e8f0-4216-ade3-6def48bd8859" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.750883 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.755463 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2jdpb" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.756058 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8blvp" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.761854 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hh2p9" podStartSLOduration=8.761842417 
podStartE2EDuration="8.761842417s" podCreationTimestamp="2026-01-25 05:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:50.76051274 +0000 UTC m=+141.796390721" watchObservedRunningTime="2026-01-25 05:40:50.761842417 +0000 UTC m=+141.797720388" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.842884 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:50 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:50 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:50 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.843171 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.844494 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2b9\" (UniqueName: \"kubernetes.io/projected/37f3e115-0b8f-473d-9cb3-dea0a2685889-kube-api-access-hx2b9\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.844678 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-utilities\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " 
pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.844883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-catalog-content\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.917110 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.917535 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9n9h8"] Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.918979 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.928739 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.931990 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n9h8"] Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.945829 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2b9\" (UniqueName: \"kubernetes.io/projected/37f3e115-0b8f-473d-9cb3-dea0a2685889-kube-api-access-hx2b9\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.945902 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-utilities\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.945961 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-catalog-content\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.946458 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-catalog-content\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.947027 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-utilities\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:50 crc kubenswrapper[4728]: I0125 05:40:50.967843 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2b9\" (UniqueName: \"kubernetes.io/projected/37f3e115-0b8f-473d-9cb3-dea0a2685889-kube-api-access-hx2b9\") pod \"certified-operators-d7c9g\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.021826 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.048273 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-catalog-content\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.048555 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-utilities\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.048576 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv96k\" (UniqueName: \"kubernetes.io/projected/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-kube-api-access-wv96k\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.101916 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pjdnb"] Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.104153 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.115614 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjdnb"] Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.123745 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22q6g"] Jan 25 05:40:51 crc kubenswrapper[4728]: W0125 05:40:51.127138 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc5ba2c4_eaeb_4620_b71b_7bb45ad640d1.slice/crio-3840de3c61125411d59ed4c25c5e9f57cd168bb37a3765266e68851002de2788 WatchSource:0}: Error finding container 3840de3c61125411d59ed4c25c5e9f57cd168bb37a3765266e68851002de2788: Status 404 returned error can't find the container with id 3840de3c61125411d59ed4c25c5e9f57cd168bb37a3765266e68851002de2788 Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.150103 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-utilities\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.150143 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv96k\" (UniqueName: \"kubernetes.io/projected/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-kube-api-access-wv96k\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.150212 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-catalog-content\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.150580 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-utilities\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.150599 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-catalog-content\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.167054 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv96k\" (UniqueName: \"kubernetes.io/projected/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-kube-api-access-wv96k\") pod \"community-operators-9n9h8\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.239836 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.251011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdb9\" (UniqueName: \"kubernetes.io/projected/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-kube-api-access-pwdb9\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.251098 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-utilities\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.251144 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-catalog-content\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.307045 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9pnzz"] Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.307884 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.323649 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9pnzz"] Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.334412 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.354383 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdb9\" (UniqueName: \"kubernetes.io/projected/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-kube-api-access-pwdb9\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.354784 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-utilities\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.354807 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-catalog-content\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.355128 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-catalog-content\") pod \"certified-operators-pjdnb\" (UID: 
\"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.357057 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-utilities\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.371039 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdb9\" (UniqueName: \"kubernetes.io/projected/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-kube-api-access-pwdb9\") pod \"certified-operators-pjdnb\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.376838 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7c9g"] Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.415687 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k2b65" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.419896 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.455521 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-utilities\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.455592 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7dk\" (UniqueName: \"kubernetes.io/projected/662e56c9-81a2-457b-8448-8cea4a0005c2-kube-api-access-ss7dk\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.455636 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-catalog-content\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.556935 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-catalog-content\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.557046 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-utilities\") pod 
\"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.557215 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7dk\" (UniqueName: \"kubernetes.io/projected/662e56c9-81a2-457b-8448-8cea4a0005c2-kube-api-access-ss7dk\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.557494 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-utilities\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.558011 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-catalog-content\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.564540 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjdnb"] Jan 25 05:40:51 crc kubenswrapper[4728]: W0125 05:40:51.571957 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16045e91_fc2e_4fc5_8cf9_a40bae675e7d.slice/crio-905622d9836ddd3cb44e24f1ef12909961bd73c530fd8bbada41d6c203646418 WatchSource:0}: Error finding container 905622d9836ddd3cb44e24f1ef12909961bd73c530fd8bbada41d6c203646418: Status 404 returned error can't find the container with id 
905622d9836ddd3cb44e24f1ef12909961bd73c530fd8bbada41d6c203646418 Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.574395 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7dk\" (UniqueName: \"kubernetes.io/projected/662e56c9-81a2-457b-8448-8cea4a0005c2-kube-api-access-ss7dk\") pod \"community-operators-9pnzz\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.611821 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n9h8"] Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.624472 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:40:51 crc kubenswrapper[4728]: W0125 05:40:51.626306 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b46b12_a3f2_41fc_9d28_3c9a0dbc8b55.slice/crio-3c466404fd6f6b2ac8fbde73e9f68e1f32e535d55c183fc762f83210753a7e66 WatchSource:0}: Error finding container 3c466404fd6f6b2ac8fbde73e9f68e1f32e535d55c183fc762f83210753a7e66: Status 404 returned error can't find the container with id 3c466404fd6f6b2ac8fbde73e9f68e1f32e535d55c183fc762f83210753a7e66 Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.751618 4728 generic.go:334] "Generic (PLEG): container finished" podID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerID="6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057" exitCode=0 Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.751740 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7c9g" event={"ID":"37f3e115-0b8f-473d-9cb3-dea0a2685889","Type":"ContainerDied","Data":"6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 
05:40:51.751800 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7c9g" event={"ID":"37f3e115-0b8f-473d-9cb3-dea0a2685889","Type":"ContainerStarted","Data":"28303655ac5f500688365e9a862c5450b0cd778b52d97ae153a83d524a9055e1"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.753656 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.757072 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerStarted","Data":"bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.757102 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerStarted","Data":"3c466404fd6f6b2ac8fbde73e9f68e1f32e535d55c183fc762f83210753a7e66"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.758658 4728 generic.go:334] "Generic (PLEG): container finished" podID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerID="86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f" exitCode=0 Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.758690 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjdnb" event={"ID":"16045e91-fc2e-4fc5-8cf9-a40bae675e7d","Type":"ContainerDied","Data":"86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.758705 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjdnb" event={"ID":"16045e91-fc2e-4fc5-8cf9-a40bae675e7d","Type":"ContainerStarted","Data":"905622d9836ddd3cb44e24f1ef12909961bd73c530fd8bbada41d6c203646418"} Jan 25 
05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.760274 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" event={"ID":"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1","Type":"ContainerStarted","Data":"11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.760295 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" event={"ID":"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1","Type":"ContainerStarted","Data":"3840de3c61125411d59ed4c25c5e9f57cd168bb37a3765266e68851002de2788"} Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.802932 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" podStartSLOduration=121.80291338 podStartE2EDuration="2m1.80291338s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:40:51.786966051 +0000 UTC m=+142.822844051" watchObservedRunningTime="2026-01-25 05:40:51.80291338 +0000 UTC m=+142.838791360" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.823939 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9pnzz"] Jan 25 05:40:51 crc kubenswrapper[4728]: W0125 05:40:51.826075 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod662e56c9_81a2_457b_8448_8cea4a0005c2.slice/crio-c810d816565ab11f9523c9ec1da03191cfad6e499815205f33d8ff5706b8164c WatchSource:0}: Error finding container c810d816565ab11f9523c9ec1da03191cfad6e499815205f33d8ff5706b8164c: Status 404 returned error can't find the container with id c810d816565ab11f9523c9ec1da03191cfad6e499815205f33d8ff5706b8164c Jan 25 05:40:51 crc kubenswrapper[4728]: 
I0125 05:40:51.857694 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:51 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:51 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:51 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.857767 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:51 crc kubenswrapper[4728]: I0125 05:40:51.996056 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.064836 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf8mb\" (UniqueName: \"kubernetes.io/projected/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-kube-api-access-zf8mb\") pod \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.064986 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-secret-volume\") pod \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.065026 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-config-volume\") pod 
\"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\" (UID: \"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68\") " Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.067369 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-config-volume" (OuterVolumeSpecName: "config-volume") pod "322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" (UID: "322a41cf-af1e-4c7e-80e3-b7c7a32e8c68"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.070274 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-kube-api-access-zf8mb" (OuterVolumeSpecName: "kube-api-access-zf8mb") pod "322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" (UID: "322a41cf-af1e-4c7e-80e3-b7c7a32e8c68"). InnerVolumeSpecName "kube-api-access-zf8mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.084226 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" (UID: "322a41cf-af1e-4c7e-80e3-b7c7a32e8c68"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.168225 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.168283 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-config-volume\") on node \"crc\" DevicePath \"\"" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.168294 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf8mb\" (UniqueName: \"kubernetes.io/projected/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68-kube-api-access-zf8mb\") on node \"crc\" DevicePath \"\"" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.766781 4728 generic.go:334] "Generic (PLEG): container finished" podID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerID="bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0" exitCode=0 Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.766868 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerDied","Data":"bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0"} Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.769939 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" event={"ID":"322a41cf-af1e-4c7e-80e3-b7c7a32e8c68","Type":"ContainerDied","Data":"cc1f453e707064c313e5bec008cad4e2e41239e2dc506bc441d650951faa1acf"} Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.769991 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.770018 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc1f453e707064c313e5bec008cad4e2e41239e2dc506bc441d650951faa1acf" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.772345 4728 generic.go:334] "Generic (PLEG): container finished" podID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerID="aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0" exitCode=0 Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.772396 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pnzz" event={"ID":"662e56c9-81a2-457b-8448-8cea4a0005c2","Type":"ContainerDied","Data":"aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0"} Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.772440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pnzz" event={"ID":"662e56c9-81a2-457b-8448-8cea4a0005c2","Type":"ContainerStarted","Data":"c810d816565ab11f9523c9ec1da03191cfad6e499815205f33d8ff5706b8164c"} Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.773867 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.842900 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:52 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:52 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:52 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.842973 4728 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.908442 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-785sb"] Jan 25 05:40:52 crc kubenswrapper[4728]: E0125 05:40:52.908710 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" containerName="collect-profiles" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.908723 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" containerName="collect-profiles" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.908844 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" containerName="collect-profiles" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.909860 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.912771 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.914451 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-785sb"] Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.978876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-utilities\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.978926 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tv2k\" (UniqueName: \"kubernetes.io/projected/b59c4b5c-3733-4fed-8410-57526ce048b2-kube-api-access-4tv2k\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.979182 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-catalog-content\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.991661 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.992668 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.994561 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 25 05:40:52 crc kubenswrapper[4728]: I0125 05:40:52.994764 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.000112 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.080526 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b66e5e50-c238-476c-b592-afee310cbda7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.080676 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-utilities\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.080708 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tv2k\" (UniqueName: \"kubernetes.io/projected/b59c4b5c-3733-4fed-8410-57526ce048b2-kube-api-access-4tv2k\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.081238 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-utilities\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.081511 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b66e5e50-c238-476c-b592-afee310cbda7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.081631 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-catalog-content\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.081962 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-catalog-content\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.097999 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tv2k\" (UniqueName: \"kubernetes.io/projected/b59c4b5c-3733-4fed-8410-57526ce048b2-kube-api-access-4tv2k\") pod \"redhat-marketplace-785sb\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.183528 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/b66e5e50-c238-476c-b592-afee310cbda7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.183628 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b66e5e50-c238-476c-b592-afee310cbda7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.183692 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b66e5e50-c238-476c-b592-afee310cbda7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.197857 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b66e5e50-c238-476c-b592-afee310cbda7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.224830 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.302098 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mtcb"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.303264 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.310792 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.344013 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mtcb"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.386622 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-utilities\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.386878 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqtf7\" (UniqueName: \"kubernetes.io/projected/394acccd-e8b7-4180-b169-2bad0dd7f7ba-kube-api-access-pqtf7\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.386943 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-catalog-content\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.425448 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-785sb"] Jan 25 05:40:53 crc kubenswrapper[4728]: W0125 05:40:53.435821 4728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb59c4b5c_3733_4fed_8410_57526ce048b2.slice/crio-36d0dc238edf0160a93f37629638e24e6f772c8132e4b1b6c71f2234c6590885 WatchSource:0}: Error finding container 36d0dc238edf0160a93f37629638e24e6f772c8132e4b1b6c71f2234c6590885: Status 404 returned error can't find the container with id 36d0dc238edf0160a93f37629638e24e6f772c8132e4b1b6c71f2234c6590885 Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.489661 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-utilities\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.489775 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqtf7\" (UniqueName: \"kubernetes.io/projected/394acccd-e8b7-4180-b169-2bad0dd7f7ba-kube-api-access-pqtf7\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.489830 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-catalog-content\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.490731 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-catalog-content\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 
crc kubenswrapper[4728]: I0125 05:40:53.490762 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-utilities\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.501023 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.508125 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqtf7\" (UniqueName: \"kubernetes.io/projected/394acccd-e8b7-4180-b169-2bad0dd7f7ba-kube-api-access-pqtf7\") pod \"redhat-marketplace-8mtcb\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.623275 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.780652 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b66e5e50-c238-476c-b592-afee310cbda7","Type":"ContainerStarted","Data":"5dadcd929e42823a20afb39a844178256bc48c920537e5ca1d7023112c6626f6"} Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.783756 4728 generic.go:334] "Generic (PLEG): container finished" podID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerID="d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b" exitCode=0 Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.784805 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-785sb" event={"ID":"b59c4b5c-3733-4fed-8410-57526ce048b2","Type":"ContainerDied","Data":"d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b"} Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.784822 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-785sb" event={"ID":"b59c4b5c-3733-4fed-8410-57526ce048b2","Type":"ContainerStarted","Data":"36d0dc238edf0160a93f37629638e24e6f772c8132e4b1b6c71f2234c6590885"} Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.845863 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:53 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:53 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:53 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.845893 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" 
podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.872778 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mtcb"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.905115 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6zfs"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.906270 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.910078 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.911718 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6zfs"] Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.995578 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-utilities\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.995841 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x8b6\" (UniqueName: \"kubernetes.io/projected/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-kube-api-access-5x8b6\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:53 crc kubenswrapper[4728]: I0125 05:40:53.995901 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-catalog-content\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.101529 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x8b6\" (UniqueName: \"kubernetes.io/projected/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-kube-api-access-5x8b6\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.101597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-catalog-content\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.101647 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-utilities\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.102021 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-utilities\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.102158 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-catalog-content\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.124799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x8b6\" (UniqueName: \"kubernetes.io/projected/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-kube-api-access-5x8b6\") pod \"redhat-operators-p6zfs\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.240401 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.308512 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pdckc"] Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.309545 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.316789 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdckc"] Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.406979 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-catalog-content\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.407012 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbbb\" (UniqueName: \"kubernetes.io/projected/aa46f651-22c1-40df-9843-622d15eb26e7-kube-api-access-6nbbb\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.407057 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-utilities\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.508548 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-catalog-content\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.508597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6nbbb\" (UniqueName: \"kubernetes.io/projected/aa46f651-22c1-40df-9843-622d15eb26e7-kube-api-access-6nbbb\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.508640 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-utilities\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.509479 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-utilities\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.509893 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-catalog-content\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.524660 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbbb\" (UniqueName: \"kubernetes.io/projected/aa46f651-22c1-40df-9843-622d15eb26e7-kube-api-access-6nbbb\") pod \"redhat-operators-pdckc\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.627909 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.677553 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6zfs"] Jan 25 05:40:54 crc kubenswrapper[4728]: W0125 05:40:54.693468 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a82d6e_706c_4b85_81ca_9bf8fb99d904.slice/crio-30ff0604d63af93c092c9be80c39e72bc29af6fa8afc1ec00571ee73283066b7 WatchSource:0}: Error finding container 30ff0604d63af93c092c9be80c39e72bc29af6fa8afc1ec00571ee73283066b7: Status 404 returned error can't find the container with id 30ff0604d63af93c092c9be80c39e72bc29af6fa8afc1ec00571ee73283066b7 Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.751072 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.755549 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-sw8t2" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.820639 4728 generic.go:334] "Generic (PLEG): container finished" podID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerID="f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97" exitCode=0 Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.820830 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mtcb" event={"ID":"394acccd-e8b7-4180-b169-2bad0dd7f7ba","Type":"ContainerDied","Data":"f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97"} Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.820937 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mtcb" 
event={"ID":"394acccd-e8b7-4180-b169-2bad0dd7f7ba","Type":"ContainerStarted","Data":"ac38515267e8d8578d086ad0d08b636c7c1817a124c41f2614dc37c9306394ab"} Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.826134 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerStarted","Data":"30ff0604d63af93c092c9be80c39e72bc29af6fa8afc1ec00571ee73283066b7"} Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.844250 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:54 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:54 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:54 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.844290 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.844785 4728 generic.go:334] "Generic (PLEG): container finished" podID="b66e5e50-c238-476c-b592-afee310cbda7" containerID="baaacfba0457395d7fc96270bc14ad288f54a5035141e13acef17d95c671341f" exitCode=0 Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.845453 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b66e5e50-c238-476c-b592-afee310cbda7","Type":"ContainerDied","Data":"baaacfba0457395d7fc96270bc14ad288f54a5035141e13acef17d95c671341f"} Jan 25 05:40:54 crc kubenswrapper[4728]: I0125 05:40:54.873198 4728 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-pdckc"] Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.319795 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.319860 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.319888 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.319908 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.321528 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.324919 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.325028 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.325518 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.339230 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.435445 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nlfgj" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.437056 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.441893 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 25 05:40:55 crc kubenswrapper[4728]: W0125 05:40:55.544167 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-039dd9dd4939a68e56a81149e6b2fa1ee58de60c0da2f7e05f2ff46a2c310181 WatchSource:0}: Error finding container 039dd9dd4939a68e56a81149e6b2fa1ee58de60c0da2f7e05f2ff46a2c310181: Status 404 returned error can't find the container with id 039dd9dd4939a68e56a81149e6b2fa1ee58de60c0da2f7e05f2ff46a2c310181 Jan 25 05:40:55 crc kubenswrapper[4728]: W0125 05:40:55.671487 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b44766cb3b76487f70b626af573870d54782300804172867adc9e8a1adfcfd0e WatchSource:0}: Error finding container b44766cb3b76487f70b626af573870d54782300804172867adc9e8a1adfcfd0e: Status 404 returned error can't find the container with id b44766cb3b76487f70b626af573870d54782300804172867adc9e8a1adfcfd0e Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.719476 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:55 crc 
kubenswrapper[4728]: I0125 05:40:55.719527 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.724744 4728 patch_prober.go:28] interesting pod/console-f9d7485db-ntzq4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.724817 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ntzq4" podUID="607307b0-b3c9-4a00-9347-a299a689c1c8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.840083 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.844648 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:55 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:55 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:55 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.844680 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.854013 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerID="4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d" exitCode=0 Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.854065 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerDied","Data":"4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d"} Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.855587 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"332769b27502a9a24325d82e100e3c72e6130b929d5595441112a58bfbbd230c"} Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.855616 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"039dd9dd4939a68e56a81149e6b2fa1ee58de60c0da2f7e05f2ff46a2c310181"} Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.855803 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.859761 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b44766cb3b76487f70b626af573870d54782300804172867adc9e8a1adfcfd0e"} Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.862713 4728 generic.go:334] "Generic (PLEG): container finished" podID="aa46f651-22c1-40df-9843-622d15eb26e7" containerID="df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b" exitCode=0 Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.862817 4728 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerDied","Data":"df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b"} Jan 25 05:40:55 crc kubenswrapper[4728]: I0125 05:40:55.862847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerStarted","Data":"8aae1073866590482452bb019c89e6f2fd1a3cfd12143193bc0e6ee12c0fb6e7"} Jan 25 05:40:56 crc kubenswrapper[4728]: I0125 05:40:56.842529 4728 patch_prober.go:28] interesting pod/router-default-5444994796-75jc5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 25 05:40:56 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Jan 25 05:40:56 crc kubenswrapper[4728]: [+]process-running ok Jan 25 05:40:56 crc kubenswrapper[4728]: healthz check failed Jan 25 05:40:56 crc kubenswrapper[4728]: I0125 05:40:56.842803 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-75jc5" podUID="b4b43f81-6cbe-45f7-95d8-cd60fb107efd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 25 05:40:56 crc kubenswrapper[4728]: I0125 05:40:56.873960 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"244b4e478758fc045ad546801197fc48633d819019c3da639a18b38682cfdbb6"} Jan 25 05:40:57 crc kubenswrapper[4728]: I0125 05:40:57.842846 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:57 crc kubenswrapper[4728]: I0125 05:40:57.845799 4728 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-75jc5" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.083743 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.084727 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.087975 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.088219 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.092017 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.170851 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5895593f-451c-4528-965d-6696bee97a45-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.170927 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5895593f-451c-4528-965d-6696bee97a45-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.271977 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/5895593f-451c-4528-965d-6696bee97a45-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.272085 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5895593f-451c-4528-965d-6696bee97a45-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.272150 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5895593f-451c-4528-965d-6696bee97a45-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.316034 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5895593f-451c-4528-965d-6696bee97a45-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:58 crc kubenswrapper[4728]: I0125 05:40:58.402658 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.743842 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.801857 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b66e5e50-c238-476c-b592-afee310cbda7-kube-api-access\") pod \"b66e5e50-c238-476c-b592-afee310cbda7\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.802176 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b66e5e50-c238-476c-b592-afee310cbda7-kubelet-dir\") pod \"b66e5e50-c238-476c-b592-afee310cbda7\" (UID: \"b66e5e50-c238-476c-b592-afee310cbda7\") " Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.802282 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b66e5e50-c238-476c-b592-afee310cbda7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b66e5e50-c238-476c-b592-afee310cbda7" (UID: "b66e5e50-c238-476c-b592-afee310cbda7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.802524 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b66e5e50-c238-476c-b592-afee310cbda7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.806332 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66e5e50-c238-476c-b592-afee310cbda7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b66e5e50-c238-476c-b592-afee310cbda7" (UID: "b66e5e50-c238-476c-b592-afee310cbda7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.904114 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b66e5e50-c238-476c-b592-afee310cbda7-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.920518 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b66e5e50-c238-476c-b592-afee310cbda7","Type":"ContainerDied","Data":"5dadcd929e42823a20afb39a844178256bc48c920537e5ca1d7023112c6626f6"} Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.920545 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.920556 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dadcd929e42823a20afb39a844178256bc48c920537e5ca1d7023112c6626f6" Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.922298 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"81a54d2981ff90fcc0a5ed6990cb0c9280029ebdc5b277c0cb49754ce5cc2541"} Jan 25 05:40:59 crc kubenswrapper[4728]: I0125 05:40:59.922370 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"488d482c778993345ee6d4a7cc4c9634785578ec00d751c676fe98f8550f6d26"} Jan 25 05:41:00 crc kubenswrapper[4728]: I0125 05:41:00.087739 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 25 05:41:00 crc kubenswrapper[4728]: W0125 
05:41:00.102311 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5895593f_451c_4528_965d_6696bee97a45.slice/crio-972ff5b8cce83e1ace77f5ec2314b729befee12b3fbc319dd34f6f8e3e3c1c13 WatchSource:0}: Error finding container 972ff5b8cce83e1ace77f5ec2314b729befee12b3fbc319dd34f6f8e3e3c1c13: Status 404 returned error can't find the container with id 972ff5b8cce83e1ace77f5ec2314b729befee12b3fbc319dd34f6f8e3e3c1c13 Jan 25 05:41:00 crc kubenswrapper[4728]: I0125 05:41:00.938624 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5895593f-451c-4528-965d-6696bee97a45","Type":"ContainerStarted","Data":"7c2ae003206b8a59137d9456f9045069e68cc869e542d0e2f5960bc69323f9b4"} Jan 25 05:41:00 crc kubenswrapper[4728]: I0125 05:41:00.938955 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5895593f-451c-4528-965d-6696bee97a45","Type":"ContainerStarted","Data":"972ff5b8cce83e1ace77f5ec2314b729befee12b3fbc319dd34f6f8e3e3c1c13"} Jan 25 05:41:00 crc kubenswrapper[4728]: I0125 05:41:00.956145 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pvqpc" Jan 25 05:41:00 crc kubenswrapper[4728]: I0125 05:41:00.968251 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.9682383249999997 podStartE2EDuration="2.968238325s" podCreationTimestamp="2026-01-25 05:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:41:00.952242835 +0000 UTC m=+151.988120814" watchObservedRunningTime="2026-01-25 05:41:00.968238325 +0000 UTC m=+152.004116306" Jan 25 05:41:01 crc kubenswrapper[4728]: I0125 05:41:01.954660 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="5895593f-451c-4528-965d-6696bee97a45" containerID="7c2ae003206b8a59137d9456f9045069e68cc869e542d0e2f5960bc69323f9b4" exitCode=0 Jan 25 05:41:01 crc kubenswrapper[4728]: I0125 05:41:01.954920 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5895593f-451c-4528-965d-6696bee97a45","Type":"ContainerDied","Data":"7c2ae003206b8a59137d9456f9045069e68cc869e542d0e2f5960bc69323f9b4"} Jan 25 05:41:05 crc kubenswrapper[4728]: I0125 05:41:05.724251 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:41:05 crc kubenswrapper[4728]: I0125 05:41:05.727605 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.059111 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.129978 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5895593f-451c-4528-965d-6696bee97a45-kubelet-dir\") pod \"5895593f-451c-4528-965d-6696bee97a45\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.130284 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5895593f-451c-4528-965d-6696bee97a45-kube-api-access\") pod \"5895593f-451c-4528-965d-6696bee97a45\" (UID: \"5895593f-451c-4528-965d-6696bee97a45\") " Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.130115 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5895593f-451c-4528-965d-6696bee97a45-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"5895593f-451c-4528-965d-6696bee97a45" (UID: "5895593f-451c-4528-965d-6696bee97a45"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.130592 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5895593f-451c-4528-965d-6696bee97a45-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.140701 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5895593f-451c-4528-965d-6696bee97a45-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5895593f-451c-4528-965d-6696bee97a45" (UID: "5895593f-451c-4528-965d-6696bee97a45"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:09 crc kubenswrapper[4728]: I0125 05:41:09.231653 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5895593f-451c-4528-965d-6696bee97a45-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:10 crc kubenswrapper[4728]: I0125 05:41:10.006055 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5895593f-451c-4528-965d-6696bee97a45","Type":"ContainerDied","Data":"972ff5b8cce83e1ace77f5ec2314b729befee12b3fbc319dd34f6f8e3e3c1c13"} Jan 25 05:41:10 crc kubenswrapper[4728]: I0125 05:41:10.006099 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="972ff5b8cce83e1ace77f5ec2314b729befee12b3fbc319dd34f6f8e3e3c1c13" Jan 25 05:41:10 crc kubenswrapper[4728]: I0125 05:41:10.006331 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 25 05:41:10 crc kubenswrapper[4728]: I0125 05:41:10.926168 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:41:12 crc kubenswrapper[4728]: I0125 05:41:12.373665 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:41:12 crc kubenswrapper[4728]: I0125 05:41:12.383514 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/accc0eb5-6067-4ab9-bbab-6d2ae898942f-metrics-certs\") pod \"network-metrics-daemon-k5pj4\" (UID: \"accc0eb5-6067-4ab9-bbab-6d2ae898942f\") " pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:41:12 crc kubenswrapper[4728]: I0125 05:41:12.443748 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k5pj4" Jan 25 05:41:12 crc kubenswrapper[4728]: I0125 05:41:12.854953 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k5pj4"] Jan 25 05:41:12 crc kubenswrapper[4728]: I0125 05:41:12.899210 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:41:12 crc kubenswrapper[4728]: I0125 05:41:12.899261 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.022353 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerStarted","Data":"e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.024118 4728 generic.go:334] "Generic (PLEG): container finished" podID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerID="f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab" exitCode=0 Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.024181 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjdnb" event={"ID":"16045e91-fc2e-4fc5-8cf9-a40bae675e7d","Type":"ContainerDied","Data":"f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.027518 4728 generic.go:334] "Generic (PLEG): 
container finished" podID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerID="1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539" exitCode=0 Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.027611 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pnzz" event={"ID":"662e56c9-81a2-457b-8448-8cea4a0005c2","Type":"ContainerDied","Data":"1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.029537 4728 generic.go:334] "Generic (PLEG): container finished" podID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerID="a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4" exitCode=0 Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.029607 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mtcb" event={"ID":"394acccd-e8b7-4180-b169-2bad0dd7f7ba","Type":"ContainerDied","Data":"a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.036396 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerStarted","Data":"ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.037835 4728 generic.go:334] "Generic (PLEG): container finished" podID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerID="208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4" exitCode=0 Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.037903 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7c9g" event={"ID":"37f3e115-0b8f-473d-9cb3-dea0a2685889","Type":"ContainerDied","Data":"208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 
05:41:13.045523 4728 generic.go:334] "Generic (PLEG): container finished" podID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerID="952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c" exitCode=0 Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.045621 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerDied","Data":"952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c"} Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.051358 4728 generic.go:334] "Generic (PLEG): container finished" podID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerID="acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474" exitCode=0 Jan 25 05:41:13 crc kubenswrapper[4728]: I0125 05:41:13.051395 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-785sb" event={"ID":"b59c4b5c-3733-4fed-8410-57526ce048b2","Type":"ContainerDied","Data":"acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.059365 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerID="ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812" exitCode=0 Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.059445 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerDied","Data":"ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.062706 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7c9g" 
event={"ID":"37f3e115-0b8f-473d-9cb3-dea0a2685889","Type":"ContainerStarted","Data":"9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.071775 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerStarted","Data":"1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.074483 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pnzz" event={"ID":"662e56c9-81a2-457b-8448-8cea4a0005c2","Type":"ContainerStarted","Data":"eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.077713 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mtcb" event={"ID":"394acccd-e8b7-4180-b169-2bad0dd7f7ba","Type":"ContainerStarted","Data":"654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.082046 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" event={"ID":"accc0eb5-6067-4ab9-bbab-6d2ae898942f","Type":"ContainerStarted","Data":"23e4d7b8772e64a1992bfdf8c49f2e176bd5c29607744837f045213a4d602b81"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.082151 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" event={"ID":"accc0eb5-6067-4ab9-bbab-6d2ae898942f","Type":"ContainerStarted","Data":"fdc857c9375003f662d0cc20718688079dfa394d59f6622764c4720c15e6efe5"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.082171 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k5pj4" 
event={"ID":"accc0eb5-6067-4ab9-bbab-6d2ae898942f","Type":"ContainerStarted","Data":"dec02ebd4b48767d00b30cd8dc006bef0032346d5ad557bbebaf646cb78d229b"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.086940 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-785sb" event={"ID":"b59c4b5c-3733-4fed-8410-57526ce048b2","Type":"ContainerStarted","Data":"05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.109071 4728 generic.go:334] "Generic (PLEG): container finished" podID="aa46f651-22c1-40df-9843-622d15eb26e7" containerID="e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8" exitCode=0 Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.109153 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerDied","Data":"e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.120224 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9n9h8" podStartSLOduration=3.377902758 podStartE2EDuration="24.120203409s" podCreationTimestamp="2026-01-25 05:40:50 +0000 UTC" firstStartedPulling="2026-01-25 05:40:52.769574874 +0000 UTC m=+143.805452854" lastFinishedPulling="2026-01-25 05:41:13.511875526 +0000 UTC m=+164.547753505" observedRunningTime="2026-01-25 05:41:14.11873906 +0000 UTC m=+165.154617040" watchObservedRunningTime="2026-01-25 05:41:14.120203409 +0000 UTC m=+165.156081389" Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.124374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjdnb" 
event={"ID":"16045e91-fc2e-4fc5-8cf9-a40bae675e7d","Type":"ContainerStarted","Data":"b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e"} Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.138367 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7c9g" podStartSLOduration=2.337109361 podStartE2EDuration="24.138347117s" podCreationTimestamp="2026-01-25 05:40:50 +0000 UTC" firstStartedPulling="2026-01-25 05:40:51.753346585 +0000 UTC m=+142.789224565" lastFinishedPulling="2026-01-25 05:41:13.554584342 +0000 UTC m=+164.590462321" observedRunningTime="2026-01-25 05:41:14.133843337 +0000 UTC m=+165.169721318" watchObservedRunningTime="2026-01-25 05:41:14.138347117 +0000 UTC m=+165.174225096" Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.158905 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9pnzz" podStartSLOduration=2.398102372 podStartE2EDuration="23.158896487s" podCreationTimestamp="2026-01-25 05:40:51 +0000 UTC" firstStartedPulling="2026-01-25 05:40:52.773671838 +0000 UTC m=+143.809549808" lastFinishedPulling="2026-01-25 05:41:13.534465954 +0000 UTC m=+164.570343923" observedRunningTime="2026-01-25 05:41:14.158667706 +0000 UTC m=+165.194545686" watchObservedRunningTime="2026-01-25 05:41:14.158896487 +0000 UTC m=+165.194774457" Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.177971 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mtcb" podStartSLOduration=2.509446637 podStartE2EDuration="21.177959837s" podCreationTimestamp="2026-01-25 05:40:53 +0000 UTC" firstStartedPulling="2026-01-25 05:40:54.840297349 +0000 UTC m=+145.876175329" lastFinishedPulling="2026-01-25 05:41:13.50881055 +0000 UTC m=+164.544688529" observedRunningTime="2026-01-25 05:41:14.173033992 +0000 UTC m=+165.208911973" watchObservedRunningTime="2026-01-25 
05:41:14.177959837 +0000 UTC m=+165.213837817" Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.209616 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pjdnb" podStartSLOduration=1.45274328 podStartE2EDuration="23.20959957s" podCreationTimestamp="2026-01-25 05:40:51 +0000 UTC" firstStartedPulling="2026-01-25 05:40:51.760925117 +0000 UTC m=+142.796803097" lastFinishedPulling="2026-01-25 05:41:13.517781407 +0000 UTC m=+164.553659387" observedRunningTime="2026-01-25 05:41:14.20768405 +0000 UTC m=+165.243562030" watchObservedRunningTime="2026-01-25 05:41:14.20959957 +0000 UTC m=+165.245477550" Jan 25 05:41:14 crc kubenswrapper[4728]: I0125 05:41:14.228995 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-785sb" podStartSLOduration=2.42244645 podStartE2EDuration="22.228973956s" podCreationTimestamp="2026-01-25 05:40:52 +0000 UTC" firstStartedPulling="2026-01-25 05:40:53.786603577 +0000 UTC m=+144.822481557" lastFinishedPulling="2026-01-25 05:41:13.593131083 +0000 UTC m=+164.629009063" observedRunningTime="2026-01-25 05:41:14.228868798 +0000 UTC m=+165.264746768" watchObservedRunningTime="2026-01-25 05:41:14.228973956 +0000 UTC m=+165.264851936" Jan 25 05:41:15 crc kubenswrapper[4728]: I0125 05:41:15.142227 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerStarted","Data":"5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff"} Jan 25 05:41:15 crc kubenswrapper[4728]: I0125 05:41:15.144569 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerStarted","Data":"97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d"} Jan 25 05:41:15 crc kubenswrapper[4728]: I0125 
05:41:15.161691 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k5pj4" podStartSLOduration=145.161674586 podStartE2EDuration="2m25.161674586s" podCreationTimestamp="2026-01-25 05:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:41:14.24240468 +0000 UTC m=+165.278282659" watchObservedRunningTime="2026-01-25 05:41:15.161674586 +0000 UTC m=+166.197552566" Jan 25 05:41:15 crc kubenswrapper[4728]: I0125 05:41:15.162079 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6zfs" podStartSLOduration=6.959547018 podStartE2EDuration="22.162075011s" podCreationTimestamp="2026-01-25 05:40:53 +0000 UTC" firstStartedPulling="2026-01-25 05:40:59.317391628 +0000 UTC m=+150.353269607" lastFinishedPulling="2026-01-25 05:41:14.51991962 +0000 UTC m=+165.555797600" observedRunningTime="2026-01-25 05:41:15.158937056 +0000 UTC m=+166.194815036" watchObservedRunningTime="2026-01-25 05:41:15.162075011 +0000 UTC m=+166.197952980" Jan 25 05:41:15 crc kubenswrapper[4728]: I0125 05:41:15.174219 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pdckc" podStartSLOduration=5.9119509709999996 podStartE2EDuration="21.174204562s" podCreationTimestamp="2026-01-25 05:40:54 +0000 UTC" firstStartedPulling="2026-01-25 05:40:59.31751963 +0000 UTC m=+150.353397609" lastFinishedPulling="2026-01-25 05:41:14.57977322 +0000 UTC m=+165.615651200" observedRunningTime="2026-01-25 05:41:15.17130681 +0000 UTC m=+166.207184790" watchObservedRunningTime="2026-01-25 05:41:15.174204562 +0000 UTC m=+166.210082542" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.022085 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:41:21 crc 
kubenswrapper[4728]: I0125 05:41:21.022399 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.095237 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.204565 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.240428 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.240477 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.273890 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.419961 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.420261 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.447977 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.625615 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.625729 4728 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:41:21 crc kubenswrapper[4728]: I0125 05:41:21.654900 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:41:22 crc kubenswrapper[4728]: I0125 05:41:22.209525 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:41:22 crc kubenswrapper[4728]: I0125 05:41:22.211369 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:41:22 crc kubenswrapper[4728]: I0125 05:41:22.212463 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.048755 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pjdnb"] Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.225164 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.225206 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.253032 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.624362 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.624577 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:41:23 crc 
kubenswrapper[4728]: I0125 05:41:23.647788 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9pnzz"] Jan 25 05:41:23 crc kubenswrapper[4728]: I0125 05:41:23.655726 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.189110 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pjdnb" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="registry-server" containerID="cri-o://b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e" gracePeriod=2 Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.217592 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.218496 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.241722 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.241757 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.271198 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.554264 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.620984 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdb9\" (UniqueName: \"kubernetes.io/projected/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-kube-api-access-pwdb9\") pod \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.621032 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-catalog-content\") pod \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.621166 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-utilities\") pod \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\" (UID: \"16045e91-fc2e-4fc5-8cf9-a40bae675e7d\") " Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.621732 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-utilities" (OuterVolumeSpecName: "utilities") pod "16045e91-fc2e-4fc5-8cf9-a40bae675e7d" (UID: "16045e91-fc2e-4fc5-8cf9-a40bae675e7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.625961 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-kube-api-access-pwdb9" (OuterVolumeSpecName: "kube-api-access-pwdb9") pod "16045e91-fc2e-4fc5-8cf9-a40bae675e7d" (UID: "16045e91-fc2e-4fc5-8cf9-a40bae675e7d"). InnerVolumeSpecName "kube-api-access-pwdb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.628242 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.628270 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.656119 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16045e91-fc2e-4fc5-8cf9-a40bae675e7d" (UID: "16045e91-fc2e-4fc5-8cf9-a40bae675e7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.657584 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.722837 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.722857 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdb9\" (UniqueName: \"kubernetes.io/projected/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-kube-api-access-pwdb9\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:24 crc kubenswrapper[4728]: I0125 05:41:24.722867 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16045e91-fc2e-4fc5-8cf9-a40bae675e7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.195963 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerID="b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e" exitCode=0 Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.196005 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjdnb" event={"ID":"16045e91-fc2e-4fc5-8cf9-a40bae675e7d","Type":"ContainerDied","Data":"b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e"} Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.196046 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjdnb" event={"ID":"16045e91-fc2e-4fc5-8cf9-a40bae675e7d","Type":"ContainerDied","Data":"905622d9836ddd3cb44e24f1ef12909961bd73c530fd8bbada41d6c203646418"} Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.196067 4728 scope.go:117] "RemoveContainer" containerID="b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.196015 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjdnb" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.196414 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9pnzz" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="registry-server" containerID="cri-o://eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04" gracePeriod=2 Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.211129 4728 scope.go:117] "RemoveContainer" containerID="f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.232054 4728 scope.go:117] "RemoveContainer" containerID="86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.234941 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pjdnb"] Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.235060 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.236517 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.237479 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pjdnb"] Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.295949 4728 scope.go:117] "RemoveContainer" containerID="b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e" Jan 25 05:41:25 crc kubenswrapper[4728]: E0125 05:41:25.296256 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e\": container with ID starting with 
b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e not found: ID does not exist" containerID="b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.296286 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e"} err="failed to get container status \"b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e\": rpc error: code = NotFound desc = could not find container \"b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e\": container with ID starting with b6c533b44964e34498739fca63e6ba38ddf61a78bb038bc820b9b77782339e4e not found: ID does not exist" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.296333 4728 scope.go:117] "RemoveContainer" containerID="f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab" Jan 25 05:41:25 crc kubenswrapper[4728]: E0125 05:41:25.296533 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab\": container with ID starting with f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab not found: ID does not exist" containerID="f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.296551 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab"} err="failed to get container status \"f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab\": rpc error: code = NotFound desc = could not find container \"f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab\": container with ID starting with f4d3d859755a378ebacdacd75f6a9a2028b939c55e965dd9c0b2039b46ddc8ab not found: ID does not 
exist" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.296565 4728 scope.go:117] "RemoveContainer" containerID="86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f" Jan 25 05:41:25 crc kubenswrapper[4728]: E0125 05:41:25.296766 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f\": container with ID starting with 86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f not found: ID does not exist" containerID="86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.296787 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f"} err="failed to get container status \"86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f\": rpc error: code = NotFound desc = could not find container \"86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f\": container with ID starting with 86afd21122bb919e1ab5488dcd22389c461aa434f330bf57b5719bc890da9c2f not found: ID does not exist" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.334380 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" path="/var/lib/kubelet/pods/16045e91-fc2e-4fc5-8cf9-a40bae675e7d/volumes" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.345979 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.592356 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.595976 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vp6n" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.738154 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-catalog-content\") pod \"662e56c9-81a2-457b-8448-8cea4a0005c2\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.738306 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7dk\" (UniqueName: \"kubernetes.io/projected/662e56c9-81a2-457b-8448-8cea4a0005c2-kube-api-access-ss7dk\") pod \"662e56c9-81a2-457b-8448-8cea4a0005c2\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.738444 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-utilities\") pod \"662e56c9-81a2-457b-8448-8cea4a0005c2\" (UID: \"662e56c9-81a2-457b-8448-8cea4a0005c2\") " Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.738966 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-utilities" (OuterVolumeSpecName: "utilities") pod "662e56c9-81a2-457b-8448-8cea4a0005c2" (UID: "662e56c9-81a2-457b-8448-8cea4a0005c2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.741584 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/662e56c9-81a2-457b-8448-8cea4a0005c2-kube-api-access-ss7dk" (OuterVolumeSpecName: "kube-api-access-ss7dk") pod "662e56c9-81a2-457b-8448-8cea4a0005c2" (UID: "662e56c9-81a2-457b-8448-8cea4a0005c2"). InnerVolumeSpecName "kube-api-access-ss7dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.779226 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "662e56c9-81a2-457b-8448-8cea4a0005c2" (UID: "662e56c9-81a2-457b-8448-8cea4a0005c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.840057 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss7dk\" (UniqueName: \"kubernetes.io/projected/662e56c9-81a2-457b-8448-8cea4a0005c2-kube-api-access-ss7dk\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.840083 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:25 crc kubenswrapper[4728]: I0125 05:41:25.840097 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/662e56c9-81a2-457b-8448-8cea4a0005c2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.048930 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mtcb"] Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 
05:41:26.203382 4728 generic.go:334] "Generic (PLEG): container finished" podID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerID="eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04" exitCode=0 Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.203496 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pnzz" event={"ID":"662e56c9-81a2-457b-8448-8cea4a0005c2","Type":"ContainerDied","Data":"eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04"} Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.203554 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pnzz" event={"ID":"662e56c9-81a2-457b-8448-8cea4a0005c2","Type":"ContainerDied","Data":"c810d816565ab11f9523c9ec1da03191cfad6e499815205f33d8ff5706b8164c"} Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.203563 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9pnzz" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.203578 4728 scope.go:117] "RemoveContainer" containerID="eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.216844 4728 scope.go:117] "RemoveContainer" containerID="1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.226882 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9pnzz"] Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.230286 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9pnzz"] Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.243667 4728 scope.go:117] "RemoveContainer" containerID="aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.254206 
4728 scope.go:117] "RemoveContainer" containerID="eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04" Jan 25 05:41:26 crc kubenswrapper[4728]: E0125 05:41:26.254595 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04\": container with ID starting with eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04 not found: ID does not exist" containerID="eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.254629 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04"} err="failed to get container status \"eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04\": rpc error: code = NotFound desc = could not find container \"eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04\": container with ID starting with eefaad8f2c618c667a0ebb7033d456068e43772fb7458badfc88760293c90f04 not found: ID does not exist" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.254653 4728 scope.go:117] "RemoveContainer" containerID="1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539" Jan 25 05:41:26 crc kubenswrapper[4728]: E0125 05:41:26.255039 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539\": container with ID starting with 1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539 not found: ID does not exist" containerID="1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.255115 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539"} err="failed to get container status \"1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539\": rpc error: code = NotFound desc = could not find container \"1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539\": container with ID starting with 1c7803039584065674f8ed91e72bf73684b137d35382049625a7c88c32515539 not found: ID does not exist" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.255167 4728 scope.go:117] "RemoveContainer" containerID="aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0" Jan 25 05:41:26 crc kubenswrapper[4728]: E0125 05:41:26.255530 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0\": container with ID starting with aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0 not found: ID does not exist" containerID="aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0" Jan 25 05:41:26 crc kubenswrapper[4728]: I0125 05:41:26.255632 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0"} err="failed to get container status \"aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0\": rpc error: code = NotFound desc = could not find container \"aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0\": container with ID starting with aac65f14fdd05cdd54ae3b654fca1fb88268795fda7f8ce27193738e69a834a0 not found: ID does not exist" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.209278 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mtcb" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="registry-server" 
containerID="cri-o://654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1" gracePeriod=2 Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.336286 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" path="/var/lib/kubelet/pods/662e56c9-81a2-457b-8448-8cea4a0005c2/volumes" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.641416 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.761773 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-utilities\") pod \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.761874 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-catalog-content\") pod \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.761917 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqtf7\" (UniqueName: \"kubernetes.io/projected/394acccd-e8b7-4180-b169-2bad0dd7f7ba-kube-api-access-pqtf7\") pod \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\" (UID: \"394acccd-e8b7-4180-b169-2bad0dd7f7ba\") " Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.762964 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-utilities" (OuterVolumeSpecName: "utilities") pod "394acccd-e8b7-4180-b169-2bad0dd7f7ba" (UID: "394acccd-e8b7-4180-b169-2bad0dd7f7ba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.766426 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394acccd-e8b7-4180-b169-2bad0dd7f7ba-kube-api-access-pqtf7" (OuterVolumeSpecName: "kube-api-access-pqtf7") pod "394acccd-e8b7-4180-b169-2bad0dd7f7ba" (UID: "394acccd-e8b7-4180-b169-2bad0dd7f7ba"). InnerVolumeSpecName "kube-api-access-pqtf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.781159 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "394acccd-e8b7-4180-b169-2bad0dd7f7ba" (UID: "394acccd-e8b7-4180-b169-2bad0dd7f7ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.863932 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.863966 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqtf7\" (UniqueName: \"kubernetes.io/projected/394acccd-e8b7-4180-b169-2bad0dd7f7ba-kube-api-access-pqtf7\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:27 crc kubenswrapper[4728]: I0125 05:41:27.863980 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/394acccd-e8b7-4180-b169-2bad0dd7f7ba-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.215788 4728 generic.go:334] "Generic (PLEG): container finished" podID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" 
containerID="654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1" exitCode=0 Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.215835 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mtcb" event={"ID":"394acccd-e8b7-4180-b169-2bad0dd7f7ba","Type":"ContainerDied","Data":"654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1"} Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.215854 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mtcb" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.215885 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mtcb" event={"ID":"394acccd-e8b7-4180-b169-2bad0dd7f7ba","Type":"ContainerDied","Data":"ac38515267e8d8578d086ad0d08b636c7c1817a124c41f2614dc37c9306394ab"} Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.215906 4728 scope.go:117] "RemoveContainer" containerID="654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.228190 4728 scope.go:117] "RemoveContainer" containerID="a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.237656 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mtcb"] Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.241400 4728 scope.go:117] "RemoveContainer" containerID="f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.244627 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mtcb"] Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.270870 4728 scope.go:117] "RemoveContainer" containerID="654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1" Jan 25 
05:41:28 crc kubenswrapper[4728]: E0125 05:41:28.271212 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1\": container with ID starting with 654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1 not found: ID does not exist" containerID="654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.271243 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1"} err="failed to get container status \"654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1\": rpc error: code = NotFound desc = could not find container \"654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1\": container with ID starting with 654df97bcb21b31c59fa06341c007c77c462bea9310597498f2abd62677ac7d1 not found: ID does not exist" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.271264 4728 scope.go:117] "RemoveContainer" containerID="a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4" Jan 25 05:41:28 crc kubenswrapper[4728]: E0125 05:41:28.271562 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4\": container with ID starting with a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4 not found: ID does not exist" containerID="a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.271597 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4"} err="failed to get container status 
\"a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4\": rpc error: code = NotFound desc = could not find container \"a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4\": container with ID starting with a0bf2a0ce6358d1cd5974b57d399fb6672fecd4d581c5c894141681ab51ba0f4 not found: ID does not exist" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.271620 4728 scope.go:117] "RemoveContainer" containerID="f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97" Jan 25 05:41:28 crc kubenswrapper[4728]: E0125 05:41:28.271859 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97\": container with ID starting with f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97 not found: ID does not exist" containerID="f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.271882 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97"} err="failed to get container status \"f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97\": rpc error: code = NotFound desc = could not find container \"f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97\": container with ID starting with f5f6d3dc6fca1a73506bba2766928e3b7ba14a8b5e5122ab695a03d7d2d13b97 not found: ID does not exist" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.447821 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdckc"] Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.448432 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pdckc" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="registry-server" 
containerID="cri-o://97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d" gracePeriod=2 Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.820807 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.980044 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-utilities\") pod \"aa46f651-22c1-40df-9843-622d15eb26e7\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.980118 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbbb\" (UniqueName: \"kubernetes.io/projected/aa46f651-22c1-40df-9843-622d15eb26e7-kube-api-access-6nbbb\") pod \"aa46f651-22c1-40df-9843-622d15eb26e7\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.980153 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-catalog-content\") pod \"aa46f651-22c1-40df-9843-622d15eb26e7\" (UID: \"aa46f651-22c1-40df-9843-622d15eb26e7\") " Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.980921 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-utilities" (OuterVolumeSpecName: "utilities") pod "aa46f651-22c1-40df-9843-622d15eb26e7" (UID: "aa46f651-22c1-40df-9843-622d15eb26e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:28 crc kubenswrapper[4728]: I0125 05:41:28.983715 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa46f651-22c1-40df-9843-622d15eb26e7-kube-api-access-6nbbb" (OuterVolumeSpecName: "kube-api-access-6nbbb") pod "aa46f651-22c1-40df-9843-622d15eb26e7" (UID: "aa46f651-22c1-40df-9843-622d15eb26e7"). InnerVolumeSpecName "kube-api-access-6nbbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.062028 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa46f651-22c1-40df-9843-622d15eb26e7" (UID: "aa46f651-22c1-40df-9843-622d15eb26e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.081763 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nbbb\" (UniqueName: \"kubernetes.io/projected/aa46f651-22c1-40df-9843-622d15eb26e7-kube-api-access-6nbbb\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.081790 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.081801 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa46f651-22c1-40df-9843-622d15eb26e7-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.221554 4728 generic.go:334] "Generic (PLEG): container finished" podID="aa46f651-22c1-40df-9843-622d15eb26e7" 
containerID="97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d" exitCode=0 Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.221607 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdckc" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.221633 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerDied","Data":"97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d"} Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.221679 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdckc" event={"ID":"aa46f651-22c1-40df-9843-622d15eb26e7","Type":"ContainerDied","Data":"8aae1073866590482452bb019c89e6f2fd1a3cfd12143193bc0e6ee12c0fb6e7"} Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.221699 4728 scope.go:117] "RemoveContainer" containerID="97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.233212 4728 scope.go:117] "RemoveContainer" containerID="e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.245249 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdckc"] Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.247163 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pdckc"] Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.248095 4728 scope.go:117] "RemoveContainer" containerID="df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.271582 4728 scope.go:117] "RemoveContainer" containerID="97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d" Jan 25 05:41:29 crc 
kubenswrapper[4728]: E0125 05:41:29.271836 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d\": container with ID starting with 97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d not found: ID does not exist" containerID="97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.271870 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d"} err="failed to get container status \"97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d\": rpc error: code = NotFound desc = could not find container \"97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d\": container with ID starting with 97ce1708775e816d26173c7aaab3c0c46b9dba3f8af205e1a697ae9e75d4977d not found: ID does not exist" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.271897 4728 scope.go:117] "RemoveContainer" containerID="e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8" Jan 25 05:41:29 crc kubenswrapper[4728]: E0125 05:41:29.272132 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8\": container with ID starting with e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8 not found: ID does not exist" containerID="e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.272162 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8"} err="failed to get container status 
\"e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8\": rpc error: code = NotFound desc = could not find container \"e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8\": container with ID starting with e32bbfaed7b8f07fd93d64a6ae835cb33f431cf25707612b0a0e709610db05a8 not found: ID does not exist" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.272184 4728 scope.go:117] "RemoveContainer" containerID="df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b" Jan 25 05:41:29 crc kubenswrapper[4728]: E0125 05:41:29.272457 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b\": container with ID starting with df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b not found: ID does not exist" containerID="df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.272490 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b"} err="failed to get container status \"df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b\": rpc error: code = NotFound desc = could not find container \"df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b\": container with ID starting with df9d770cacc7fbfe0aeb6cf43abb0125703e32319fb00b39ac8a69e52732816b not found: ID does not exist" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.343368 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" path="/var/lib/kubelet/pods/394acccd-e8b7-4180-b169-2bad0dd7f7ba/volumes" Jan 25 05:41:29 crc kubenswrapper[4728]: I0125 05:41:29.344198 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" 
path="/var/lib/kubelet/pods/aa46f651-22c1-40df-9843-622d15eb26e7/volumes" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.484557 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.485730 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.485815 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.485873 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.485927 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.485986 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486033 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486085 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486143 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486195 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486238 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486298 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66e5e50-c238-476c-b592-afee310cbda7" containerName="pruner" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486369 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66e5e50-c238-476c-b592-afee310cbda7" containerName="pruner" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486420 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486462 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486508 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486551 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486655 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486704 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486768 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5895593f-451c-4528-965d-6696bee97a45" containerName="pruner" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486819 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895593f-451c-4528-965d-6696bee97a45" containerName="pruner" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486865 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.486913 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.486967 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487011 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.487062 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487121 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="extract-content" Jan 25 05:41:33 crc kubenswrapper[4728]: E0125 05:41:33.487177 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487221 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="extract-utilities" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487378 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="394acccd-e8b7-4180-b169-2bad0dd7f7ba" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487457 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="16045e91-fc2e-4fc5-8cf9-a40bae675e7d" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487507 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="662e56c9-81a2-457b-8448-8cea4a0005c2" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487551 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa46f651-22c1-40df-9843-622d15eb26e7" containerName="registry-server" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487595 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66e5e50-c238-476c-b592-afee310cbda7" containerName="pruner" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487640 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5895593f-451c-4528-965d-6696bee97a45" containerName="pruner" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.487996 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.489504 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.490418 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.495285 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.633413 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8cb8843-3512-4a72-9944-df1e58134528-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.633464 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8cb8843-3512-4a72-9944-df1e58134528-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.734769 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8cb8843-3512-4a72-9944-df1e58134528-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.734842 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e8cb8843-3512-4a72-9944-df1e58134528-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.734916 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8cb8843-3512-4a72-9944-df1e58134528-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.759602 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8cb8843-3512-4a72-9944-df1e58134528-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:33 crc kubenswrapper[4728]: I0125 05:41:33.800423 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:34 crc kubenswrapper[4728]: I0125 05:41:34.041856 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r62pr"] Jan 25 05:41:34 crc kubenswrapper[4728]: I0125 05:41:34.200068 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 25 05:41:34 crc kubenswrapper[4728]: W0125 05:41:34.207172 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode8cb8843_3512_4a72_9944_df1e58134528.slice/crio-8853d32e473343094c738f9140a790fd8fc16429eea93e3bcebdceb782caefe0 WatchSource:0}: Error finding container 8853d32e473343094c738f9140a790fd8fc16429eea93e3bcebdceb782caefe0: Status 404 returned error can't find the container with id 8853d32e473343094c738f9140a790fd8fc16429eea93e3bcebdceb782caefe0 Jan 25 05:41:34 crc kubenswrapper[4728]: I0125 05:41:34.247856 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e8cb8843-3512-4a72-9944-df1e58134528","Type":"ContainerStarted","Data":"8853d32e473343094c738f9140a790fd8fc16429eea93e3bcebdceb782caefe0"} Jan 25 05:41:35 crc kubenswrapper[4728]: I0125 05:41:35.256745 4728 generic.go:334] "Generic (PLEG): container finished" podID="e8cb8843-3512-4a72-9944-df1e58134528" containerID="636f23b70c0013945f23f307c7fc9d54f073f62b573cdcf8f37bb00ffb2610a9" exitCode=0 Jan 25 05:41:35 crc kubenswrapper[4728]: I0125 05:41:35.256849 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e8cb8843-3512-4a72-9944-df1e58134528","Type":"ContainerDied","Data":"636f23b70c0013945f23f307c7fc9d54f073f62b573cdcf8f37bb00ffb2610a9"} Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.539150 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.667411 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8cb8843-3512-4a72-9944-df1e58134528-kubelet-dir\") pod \"e8cb8843-3512-4a72-9944-df1e58134528\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.667552 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8cb8843-3512-4a72-9944-df1e58134528-kube-api-access\") pod \"e8cb8843-3512-4a72-9944-df1e58134528\" (UID: \"e8cb8843-3512-4a72-9944-df1e58134528\") " Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.667843 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8cb8843-3512-4a72-9944-df1e58134528-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e8cb8843-3512-4a72-9944-df1e58134528" (UID: "e8cb8843-3512-4a72-9944-df1e58134528"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.673590 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8cb8843-3512-4a72-9944-df1e58134528-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e8cb8843-3512-4a72-9944-df1e58134528" (UID: "e8cb8843-3512-4a72-9944-df1e58134528"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.769456 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8cb8843-3512-4a72-9944-df1e58134528-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:36 crc kubenswrapper[4728]: I0125 05:41:36.769499 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8cb8843-3512-4a72-9944-df1e58134528-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:37 crc kubenswrapper[4728]: I0125 05:41:37.269414 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e8cb8843-3512-4a72-9944-df1e58134528","Type":"ContainerDied","Data":"8853d32e473343094c738f9140a790fd8fc16429eea93e3bcebdceb782caefe0"} Jan 25 05:41:37 crc kubenswrapper[4728]: I0125 05:41:37.269455 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8853d32e473343094c738f9140a790fd8fc16429eea93e3bcebdceb782caefe0" Jan 25 05:41:37 crc kubenswrapper[4728]: I0125 05:41:37.269521 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.082291 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 25 05:41:38 crc kubenswrapper[4728]: E0125 05:41:38.082522 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8cb8843-3512-4a72-9944-df1e58134528" containerName="pruner" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.082535 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cb8843-3512-4a72-9944-df1e58134528" containerName="pruner" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.082620 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8cb8843-3512-4a72-9944-df1e58134528" containerName="pruner" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.082963 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.084511 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.084808 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.092939 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.184849 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-var-lock\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.185022 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kube-api-access\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.185171 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.286587 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.286688 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-var-lock\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.286728 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kube-api-access\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.286761 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.286830 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-var-lock\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.302791 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kube-api-access\") pod \"installer-9-crc\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.395042 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:41:38 crc kubenswrapper[4728]: I0125 05:41:38.778617 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 25 05:41:39 crc kubenswrapper[4728]: I0125 05:41:39.282020 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9","Type":"ContainerStarted","Data":"9e3f037e1923a41346a1d30d944c83f9836eda1bf61c29e5370096595abe5add"} Jan 25 05:41:39 crc kubenswrapper[4728]: I0125 05:41:39.282381 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9","Type":"ContainerStarted","Data":"bcbbaf0ce637046ef8bbaae07f83cc4a3a5d6da6bfdb835db478705b84951250"} Jan 25 05:41:39 crc kubenswrapper[4728]: I0125 05:41:39.297668 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.297654001 podStartE2EDuration="1.297654001s" podCreationTimestamp="2026-01-25 05:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:41:39.294439081 +0000 UTC m=+190.330317061" watchObservedRunningTime="2026-01-25 05:41:39.297654001 +0000 UTC m=+190.333531980" Jan 25 05:41:42 crc kubenswrapper[4728]: I0125 05:41:42.899713 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:41:42 crc kubenswrapper[4728]: I0125 05:41:42.900108 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" 
podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.061676 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" podUID="3a9908d7-639d-4f34-a59d-e0a03231a620" containerName="oauth-openshift" containerID="cri-o://24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9" gracePeriod=15 Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.358412 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.383364 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-phpgg"] Jan 25 05:41:59 crc kubenswrapper[4728]: E0125 05:41:59.383587 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9908d7-639d-4f34-a59d-e0a03231a620" containerName="oauth-openshift" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.383599 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9908d7-639d-4f34-a59d-e0a03231a620" containerName="oauth-openshift" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.383703 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9908d7-639d-4f34-a59d-e0a03231a620" containerName="oauth-openshift" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.384099 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.390844 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-phpgg"] Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.397227 4728 generic.go:334] "Generic (PLEG): container finished" podID="3a9908d7-639d-4f34-a59d-e0a03231a620" containerID="24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9" exitCode=0 Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.397262 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.397270 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" event={"ID":"3a9908d7-639d-4f34-a59d-e0a03231a620","Type":"ContainerDied","Data":"24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9"} Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.397300 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r62pr" event={"ID":"3a9908d7-639d-4f34-a59d-e0a03231a620","Type":"ContainerDied","Data":"93eee35971ec372c3de4487632a075752eae4c0c8108ebbb00f740b1b5e6d077"} Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.397333 4728 scope.go:117] "RemoveContainer" containerID="24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.410984 4728 scope.go:117] "RemoveContainer" containerID="24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9" Jan 25 05:41:59 crc kubenswrapper[4728]: E0125 05:41:59.411285 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9\": container with ID starting with 24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9 not found: ID does not exist" containerID="24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.411336 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9"} err="failed to get container status \"24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9\": rpc error: code = NotFound desc = could not find container \"24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9\": container with ID starting with 24b5eccd822c22f8f297f50642a114e389570b235921f3a050d44fba3d6ac0d9 not found: ID does not exist" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513382 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-idp-0-file-data\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513434 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-dir\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513465 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-router-certs\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 
05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513490 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-trusted-ca-bundle\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513518 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-session\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513548 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-provider-selection\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513579 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-login\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513606 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-serving-cert\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 
05:41:59.513624 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-error\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513682 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-cliconfig\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513705 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-policies\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513721 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6whtr\" (UniqueName: \"kubernetes.io/projected/3a9908d7-639d-4f34-a59d-e0a03231a620-kube-api-access-6whtr\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513752 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-ocp-branding-template\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513776 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-service-ca\") pod \"3a9908d7-639d-4f34-a59d-e0a03231a620\" (UID: \"3a9908d7-639d-4f34-a59d-e0a03231a620\") " Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513938 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-router-certs\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.513983 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514020 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pfh\" (UniqueName: \"kubernetes.io/projected/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-kube-api-access-q7pfh\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " 
pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514046 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-service-ca\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514078 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-session\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514110 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514126 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-policies\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514151 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514166 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-dir\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514184 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514206 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-error\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514229 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-login\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514250 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514487 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514899 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.514971 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.515517 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.518884 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.518933 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9908d7-639d-4f34-a59d-e0a03231a620-kube-api-access-6whtr" (OuterVolumeSpecName: "kube-api-access-6whtr") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "kube-api-access-6whtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.519196 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.519429 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.519728 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.519837 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.520036 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.520201 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.520370 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3a9908d7-639d-4f34-a59d-e0a03231a620" (UID: "3a9908d7-639d-4f34-a59d-e0a03231a620"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615262 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615337 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pfh\" (UniqueName: \"kubernetes.io/projected/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-kube-api-access-q7pfh\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615379 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-service-ca\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615412 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-session\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615468 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-policies\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615485 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615512 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-dir\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615530 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 
05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615554 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-error\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615572 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-login\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615594 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615616 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-router-certs\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615637 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615683 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615695 4728 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615705 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6whtr\" (UniqueName: \"kubernetes.io/projected/3a9908d7-639d-4f34-a59d-e0a03231a620-kube-api-access-6whtr\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615715 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615724 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615734 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-idp-0-file-data\") on node \"crc\" 
DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615746 4728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a9908d7-639d-4f34-a59d-e0a03231a620-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615755 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615764 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615773 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615784 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615793 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615804 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.615814 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3a9908d7-639d-4f34-a59d-e0a03231a620-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.616083 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-service-ca\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.616152 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-dir\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.616496 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.616612 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-policies\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: 
\"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.616728 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.618593 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-session\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.618694 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-error\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.618844 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.619054 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.619699 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-router-certs\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.619746 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.619860 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-login\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.619986 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " 
pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.628634 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pfh\" (UniqueName: \"kubernetes.io/projected/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-kube-api-access-q7pfh\") pod \"oauth-openshift-76f84477b-phpgg\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.696455 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.719454 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r62pr"] Jan 25 05:41:59 crc kubenswrapper[4728]: I0125 05:41:59.721921 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r62pr"] Jan 25 05:42:00 crc kubenswrapper[4728]: I0125 05:42:00.042401 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-phpgg"] Jan 25 05:42:00 crc kubenswrapper[4728]: I0125 05:42:00.402110 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" event={"ID":"fb6303d1-5e9c-41a4-8923-5ea2ed774af8","Type":"ContainerStarted","Data":"5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87"} Jan 25 05:42:00 crc kubenswrapper[4728]: I0125 05:42:00.402157 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" event={"ID":"fb6303d1-5e9c-41a4-8923-5ea2ed774af8","Type":"ContainerStarted","Data":"f00bdf1e60cac46d27f8cff20e1eea86db902f00a3fc9e6022b1b69d0cded71f"} Jan 25 05:42:00 crc kubenswrapper[4728]: I0125 05:42:00.402361 4728 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:42:00 crc kubenswrapper[4728]: I0125 05:42:00.418668 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" podStartSLOduration=26.418655369 podStartE2EDuration="26.418655369s" podCreationTimestamp="2026-01-25 05:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:42:00.416742023 +0000 UTC m=+211.452620003" watchObservedRunningTime="2026-01-25 05:42:00.418655369 +0000 UTC m=+211.454533348" Jan 25 05:42:00 crc kubenswrapper[4728]: I0125 05:42:00.514757 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:42:01 crc kubenswrapper[4728]: I0125 05:42:01.334524 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9908d7-639d-4f34-a59d-e0a03231a620" path="/var/lib/kubelet/pods/3a9908d7-639d-4f34-a59d-e0a03231a620/volumes" Jan 25 05:42:12 crc kubenswrapper[4728]: I0125 05:42:12.898765 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:42:12 crc kubenswrapper[4728]: I0125 05:42:12.898983 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:42:12 crc kubenswrapper[4728]: I0125 05:42:12.899021 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:42:12 crc kubenswrapper[4728]: I0125 05:42:12.899913 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 05:42:12 crc kubenswrapper[4728]: I0125 05:42:12.899965 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24" gracePeriod=600 Jan 25 05:42:13 crc kubenswrapper[4728]: I0125 05:42:13.455900 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24" exitCode=0 Jan 25 05:42:13 crc kubenswrapper[4728]: I0125 05:42:13.455980 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24"} Jan 25 05:42:13 crc kubenswrapper[4728]: I0125 05:42:13.456460 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"9c6bd49d9b17f994e00e405d6b8f16b6edd37de171ad4d27462fcbdcfc065a69"} Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.774417 4728 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.775183 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776046 4728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776670 4728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776694 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e" gracePeriod=15 Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776697 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db" gracePeriod=15 Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776742 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77" gracePeriod=15 Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776763 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" containerID="cri-o://d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32" gracePeriod=15 Jan 25 05:42:16 crc kubenswrapper[4728]: E0125 05:42:16.776799 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776811 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 25 05:42:16 crc kubenswrapper[4728]: E0125 05:42:16.776820 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776826 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 25 05:42:16 crc kubenswrapper[4728]: E0125 05:42:16.776833 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776821 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17" gracePeriod=15 Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776840 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 25 05:42:16 crc kubenswrapper[4728]: E0125 05:42:16.776982 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.776998 4728 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 25 05:42:16 crc kubenswrapper[4728]: E0125 05:42:16.777020 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777026 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 25 05:42:16 crc kubenswrapper[4728]: E0125 05:42:16.777033 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777038 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777236 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777247 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777258 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777264 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.777271 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.805007 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.889210 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.889499 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.889621 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.889710 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.889804 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.889904 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.890023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.890127 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991121 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991163 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991185 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991214 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991244 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991270 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991285 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991298 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991387 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991420 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991441 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991460 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991479 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991495 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991514 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:16 crc kubenswrapper[4728]: I0125 05:42:16.991552 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.108263 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:42:17 crc kubenswrapper[4728]: W0125 05:42:17.122259 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c77259e86f373b05b730c12e26c6bdb8a638a45206beab40fcc668a04aaf2d41 WatchSource:0}: Error finding container c77259e86f373b05b730c12e26c6bdb8a638a45206beab40fcc668a04aaf2d41: Status 404 returned error can't find the container with id c77259e86f373b05b730c12e26c6bdb8a638a45206beab40fcc668a04aaf2d41 Jan 25 05:42:17 crc kubenswrapper[4728]: E0125 05:42:17.124456 4728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188de2ed395c79c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-25 05:42:17.123961287 +0000 UTC m=+228.159839267,LastTimestamp:2026-01-25 05:42:17.123961287 +0000 UTC m=+228.159839267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.474228 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" containerID="9e3f037e1923a41346a1d30d944c83f9836eda1bf61c29e5370096595abe5add" exitCode=0 Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.474313 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9","Type":"ContainerDied","Data":"9e3f037e1923a41346a1d30d944c83f9836eda1bf61c29e5370096595abe5add"} Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.474919 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.475189 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.475862 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a97390983a8bb2e6b486b2a25ea92bbea6eea27ff1514fd0c13a1fa36ee792c8"} Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.475888 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c77259e86f373b05b730c12e26c6bdb8a638a45206beab40fcc668a04aaf2d41"} Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.476451 4728 
status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.476638 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.478645 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.479198 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db" exitCode=0 Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.479218 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77" exitCode=0 Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.479228 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e" exitCode=0 Jan 25 05:42:17 crc kubenswrapper[4728]: I0125 05:42:17.479236 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17" exitCode=2 Jan 25 05:42:18 crc 
kubenswrapper[4728]: I0125 05:42:18.671700 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.672500 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.672799 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.809595 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kubelet-dir\") pod \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.809931 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" (UID: "b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.810091 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kube-api-access\") pod \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.810118 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-var-lock\") pod \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\" (UID: \"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9\") " Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.810272 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-var-lock" (OuterVolumeSpecName: "var-lock") pod "b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" (UID: "b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.810404 4728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-var-lock\") on node \"crc\" DevicePath \"\"" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.810418 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.815514 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" (UID: "b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:42:18 crc kubenswrapper[4728]: I0125 05:42:18.911367 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.134095 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.135011 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.135481 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.135840 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.136170 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214201 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214266 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214516 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214559 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214648 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214698 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214805 4728 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214829 4728 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.214842 4728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.330879 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.331281 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.331620 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: 
connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.342414 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.492730 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.493412 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32" exitCode=0 Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.493452 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.493502 4728 scope.go:117] "RemoveContainer" containerID="9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.493874 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.494065 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.494244 4728 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.496055 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.496040 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9","Type":"ContainerDied","Data":"bcbbaf0ce637046ef8bbaae07f83cc4a3a5d6da6bfdb835db478705b84951250"} Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.496118 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbbaf0ce637046ef8bbaae07f83cc4a3a5d6da6bfdb835db478705b84951250" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.496178 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.496449 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.496746 4728 status_manager.go:851] "Failed to get status for pod" 
podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.498209 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.498518 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.498710 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.506202 4728 scope.go:117] "RemoveContainer" containerID="13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.515021 4728 scope.go:117] "RemoveContainer" containerID="42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.529226 4728 scope.go:117] "RemoveContainer" containerID="2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17" Jan 25 
05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.539466 4728 scope.go:117] "RemoveContainer" containerID="d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.550702 4728 scope.go:117] "RemoveContainer" containerID="83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.563114 4728 scope.go:117] "RemoveContainer" containerID="9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db" Jan 25 05:42:19 crc kubenswrapper[4728]: E0125 05:42:19.563494 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\": container with ID starting with 9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db not found: ID does not exist" containerID="9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.563526 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db"} err="failed to get container status \"9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\": rpc error: code = NotFound desc = could not find container \"9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db\": container with ID starting with 9845a17b467152addbee22376ed2e2bbb879687447f8bfe2a3337bc873aa46db not found: ID does not exist" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.563548 4728 scope.go:117] "RemoveContainer" containerID="13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77" Jan 25 05:42:19 crc kubenswrapper[4728]: E0125 05:42:19.563792 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\": container with ID starting with 13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77 not found: ID does not exist" containerID="13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.563818 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77"} err="failed to get container status \"13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\": rpc error: code = NotFound desc = could not find container \"13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77\": container with ID starting with 13774ac6a75ce890898559e79b0ae95f639eecd686a0242faaeb34ba86a65c77 not found: ID does not exist" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.563837 4728 scope.go:117] "RemoveContainer" containerID="42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e" Jan 25 05:42:19 crc kubenswrapper[4728]: E0125 05:42:19.564069 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\": container with ID starting with 42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e not found: ID does not exist" containerID="42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.564097 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e"} err="failed to get container status \"42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\": rpc error: code = NotFound desc = could not find container \"42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e\": container with ID 
starting with 42ee2803816d9636a3f534f3f374475b45fc96004205d678506d6247df99569e not found: ID does not exist" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.564117 4728 scope.go:117] "RemoveContainer" containerID="2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17" Jan 25 05:42:19 crc kubenswrapper[4728]: E0125 05:42:19.564381 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\": container with ID starting with 2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17 not found: ID does not exist" containerID="2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.564404 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17"} err="failed to get container status \"2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\": rpc error: code = NotFound desc = could not find container \"2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17\": container with ID starting with 2123c624daf687127ce444d9356dc3f858e9f64d5b67cd9415e4ebf5fea7dc17 not found: ID does not exist" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.564422 4728 scope.go:117] "RemoveContainer" containerID="d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32" Jan 25 05:42:19 crc kubenswrapper[4728]: E0125 05:42:19.564745 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\": container with ID starting with d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32 not found: ID does not exist" containerID="d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32" Jan 25 
05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.564769 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32"} err="failed to get container status \"d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\": rpc error: code = NotFound desc = could not find container \"d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32\": container with ID starting with d43a0a6b8bebbe070598954758a0fdfdfa0dbbf50f010dca3cb5e0cf75e42f32 not found: ID does not exist" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.564788 4728 scope.go:117] "RemoveContainer" containerID="83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd" Jan 25 05:42:19 crc kubenswrapper[4728]: E0125 05:42:19.564993 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\": container with ID starting with 83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd not found: ID does not exist" containerID="83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd" Jan 25 05:42:19 crc kubenswrapper[4728]: I0125 05:42:19.565017 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd"} err="failed to get container status \"83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\": rpc error: code = NotFound desc = could not find container \"83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd\": container with ID starting with 83b12d6d92d3f20054a05745347f7b69f13d01c8bda655c2c26481ecff7706dd not found: ID does not exist" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.230065 4728 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.230824 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.231297 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.231914 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.232229 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:24 crc kubenswrapper[4728]: I0125 05:42:24.232272 4728 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.232536 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="200ms" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.433046 4728 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="400ms" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.535383 4728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188de2ed395c79c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-25 05:42:17.123961287 +0000 UTC m=+228.159839267,LastTimestamp:2026-01-25 05:42:17.123961287 +0000 UTC m=+228.159839267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 25 05:42:24 crc kubenswrapper[4728]: E0125 05:42:24.834226 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="800ms" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.634983 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="1.6s" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.926133 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:42:25Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:42:25Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:42:25Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-25T05:42:25Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.926543 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.927009 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 
192.168.26.50:6443: connect: connection refused" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.927271 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.927596 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:25 crc kubenswrapper[4728]: E0125 05:42:25.927625 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 25 05:42:27 crc kubenswrapper[4728]: E0125 05:42:27.236783 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.50:6443: connect: connection refused" interval="3.2s" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.328050 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.328997 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.329236 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.342449 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.342584 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:27 crc kubenswrapper[4728]: E0125 05:42:27.344477 4728 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.344780 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.532575 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"01e4639eb782b06bbbf6538f4143c2c0a5edd7c59740cdd78c51dd22ff8e946e"} Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.532620 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2ed042a2bab9f7db7a0fe6f908cc0e4009c783d5b833cb6b59584c2a3fa805bb"} Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.532854 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.532868 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.533389 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:27 crc kubenswrapper[4728]: E0125 05:42:27.533454 4728 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:27 crc kubenswrapper[4728]: I0125 05:42:27.533656 4728 status_manager.go:851] "Failed to get status 
for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:28 crc kubenswrapper[4728]: I0125 05:42:28.538759 4728 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="01e4639eb782b06bbbf6538f4143c2c0a5edd7c59740cdd78c51dd22ff8e946e" exitCode=0 Jan 25 05:42:28 crc kubenswrapper[4728]: I0125 05:42:28.538807 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"01e4639eb782b06bbbf6538f4143c2c0a5edd7c59740cdd78c51dd22ff8e946e"} Jan 25 05:42:28 crc kubenswrapper[4728]: I0125 05:42:28.539055 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:28 crc kubenswrapper[4728]: I0125 05:42:28.539068 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:28 crc kubenswrapper[4728]: E0125 05:42:28.539359 4728 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:28 crc kubenswrapper[4728]: I0125 05:42:28.539371 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.50:6443: connect: 
connection refused" Jan 25 05:42:28 crc kubenswrapper[4728]: I0125 05:42:28.539539 4728 status_manager.go:851] "Failed to get status for pod" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.50:6443: connect: connection refused" Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.544836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"988d6c019ed6390061df8fad8a08a6847b2052929ca2d44553de7058f48c9028"} Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.545366 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cb7db58f4c6b3651fbfbb35da6fc767fa0799f80f8b35507a2f43960861f9c66"} Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.545381 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93b416a05d28e28ddd9465e439503963d0a82a2c7e14848ce41d72ac2649c7f1"} Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.545390 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eb42dfad5d924ae70473ae156b5730d24ddbce9d563578094aa8384b734cf3f9"} Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.545400 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2003b3f3e9534618e58e6f7d13061dca8ac9a299b7247d870890031b94a716ed"} Jan 25 05:42:29 
crc kubenswrapper[4728]: I0125 05:42:29.545593 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.545649 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.545664 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.548196 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.548235 4728 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978" exitCode=1 Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.548260 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978"} Jan 25 05:42:29 crc kubenswrapper[4728]: I0125 05:42:29.548573 4728 scope.go:117] "RemoveContainer" containerID="f130229c9e80e402f25bd8779375c97789a8bee6782ab72f1de74c27438bd978" Jan 25 05:42:30 crc kubenswrapper[4728]: I0125 05:42:30.031026 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:42:30 crc kubenswrapper[4728]: I0125 05:42:30.563456 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 25 05:42:30 crc kubenswrapper[4728]: I0125 05:42:30.563514 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b9289864f18f2dcaceb6354cbc951c6828e66660cb397d32f5596d637a1b223"} Jan 25 05:42:31 crc kubenswrapper[4728]: I0125 05:42:31.574212 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:42:32 crc kubenswrapper[4728]: I0125 05:42:32.345909 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:32 crc kubenswrapper[4728]: I0125 05:42:32.345947 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:32 crc kubenswrapper[4728]: I0125 05:42:32.350970 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:34 crc kubenswrapper[4728]: I0125 05:42:34.871381 4728 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:34 crc kubenswrapper[4728]: I0125 05:42:34.904970 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e1471708-b7b6-4eaa-a958-b93bf528021d" Jan 25 05:42:35 crc kubenswrapper[4728]: I0125 05:42:35.585532 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:35 crc kubenswrapper[4728]: I0125 05:42:35.585559 4728 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:35 crc kubenswrapper[4728]: I0125 05:42:35.588737 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:35 crc kubenswrapper[4728]: I0125 05:42:35.588978 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e1471708-b7b6-4eaa-a958-b93bf528021d" Jan 25 05:42:36 crc kubenswrapper[4728]: I0125 05:42:36.589747 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:36 crc kubenswrapper[4728]: I0125 05:42:36.589794 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0a4369e-a82b-4d74-afeb-fc4b69b0057a" Jan 25 05:42:36 crc kubenswrapper[4728]: I0125 05:42:36.592716 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e1471708-b7b6-4eaa-a958-b93bf528021d" Jan 25 05:42:40 crc kubenswrapper[4728]: I0125 05:42:40.031700 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:42:40 crc kubenswrapper[4728]: I0125 05:42:40.036111 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:42:40 crc kubenswrapper[4728]: I0125 05:42:40.612554 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 25 05:42:41 crc kubenswrapper[4728]: I0125 
05:42:41.610486 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 25 05:42:42 crc kubenswrapper[4728]: I0125 05:42:42.094585 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 25 05:42:42 crc kubenswrapper[4728]: I0125 05:42:42.221504 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 25 05:42:42 crc kubenswrapper[4728]: I0125 05:42:42.831458 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 25 05:42:42 crc kubenswrapper[4728]: I0125 05:42:42.927212 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 25 05:42:42 crc kubenswrapper[4728]: I0125 05:42:42.958079 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 25 05:42:42 crc kubenswrapper[4728]: I0125 05:42:42.998174 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 25 05:42:43 crc kubenswrapper[4728]: I0125 05:42:43.124262 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 25 05:42:43 crc kubenswrapper[4728]: I0125 05:42:43.536645 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 25 05:42:43 crc kubenswrapper[4728]: I0125 05:42:43.986857 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 25 05:42:44 crc kubenswrapper[4728]: I0125 05:42:44.248083 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Jan 25 05:42:44 crc kubenswrapper[4728]: I0125 05:42:44.567591 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 25 05:42:44 crc kubenswrapper[4728]: I0125 05:42:44.719374 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 25 05:42:45 crc kubenswrapper[4728]: I0125 05:42:45.990806 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 25 05:42:46 crc kubenswrapper[4728]: I0125 05:42:46.183409 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 25 05:42:46 crc kubenswrapper[4728]: I0125 05:42:46.293103 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 25 05:42:46 crc kubenswrapper[4728]: I0125 05:42:46.506315 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 25 05:42:46 crc kubenswrapper[4728]: I0125 05:42:46.540236 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 25 05:42:46 crc kubenswrapper[4728]: I0125 05:42:46.676500 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 25 05:42:46 crc kubenswrapper[4728]: I0125 05:42:46.951383 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 25 05:42:47 crc kubenswrapper[4728]: I0125 05:42:47.472664 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 25 05:42:47 crc kubenswrapper[4728]: I0125 05:42:47.587167 4728 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 25 05:42:47 crc kubenswrapper[4728]: I0125 05:42:47.916954 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 25 05:42:47 crc kubenswrapper[4728]: I0125 05:42:47.993995 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.304669 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.465500 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.590179 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.642750 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.843008 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.902945 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 25 05:42:48 crc kubenswrapper[4728]: I0125 05:42:48.923718 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.594952 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.680340 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.718568 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.805914 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.806849 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.921128 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 25 05:42:49 crc kubenswrapper[4728]: I0125 05:42:49.977042 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.064615 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.064989 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.117859 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.300720 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.465631 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.476852 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.496623 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.602722 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.637523 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.651089 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.722511 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.794075 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.856338 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 25 05:42:50 crc kubenswrapper[4728]: I0125 05:42:50.908117 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.004871 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 
05:42:51.141858 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.258981 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.324944 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.590438 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.615838 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.704991 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.769058 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 25 05:42:51 crc kubenswrapper[4728]: I0125 05:42:51.886630 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.052025 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.094348 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.128649 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.146847 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.187761 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.423664 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.440261 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.498707 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.541554 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.564796 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.574510 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.580383 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.608101 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 25 05:42:52 crc 
kubenswrapper[4728]: I0125 05:42:52.639087 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.640236 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.692087 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.705444 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.963161 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 25 05:42:52 crc kubenswrapper[4728]: I0125 05:42:52.983662 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.008342 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.062661 4728 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.083855 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.115371 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.124635 4728 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.168816 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.182868 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.203786 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.355027 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.408011 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.458410 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.462242 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.479242 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.546699 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.625058 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 
05:42:53.678596 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.835546 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.930140 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 25 05:42:53 crc kubenswrapper[4728]: I0125 05:42:53.980180 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.015507 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.018243 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.101157 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.115651 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.148303 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.181353 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.299609 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.302519 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.328781 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.382806 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.461376 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.519830 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.547540 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.548085 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.612514 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.613480 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.663926 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 
05:42:54.664250 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.673368 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.752816 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.872966 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.911135 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.913622 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.999642 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 25 05:42:54 crc kubenswrapper[4728]: I0125 05:42:54.999725 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.040464 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.071579 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.113550 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 25 05:42:55 crc 
kubenswrapper[4728]: I0125 05:42:55.119537 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.126494 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.130186 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.155173 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.170392 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.194165 4728 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.196250 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.196231286 podStartE2EDuration="39.196231286s" podCreationTimestamp="2026-01-25 05:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:42:34.881406314 +0000 UTC m=+245.917284294" watchObservedRunningTime="2026-01-25 05:42:55.196231286 +0000 UTC m=+266.232109267" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.197996 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.198039 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 25 05:42:55 crc 
kubenswrapper[4728]: I0125 05:42:55.203144 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.220104 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.220087711 podStartE2EDuration="21.220087711s" podCreationTimestamp="2026-01-25 05:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:42:55.218914029 +0000 UTC m=+266.254792009" watchObservedRunningTime="2026-01-25 05:42:55.220087711 +0000 UTC m=+266.255965691" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.417116 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.450722 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.507944 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.568591 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.618198 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.681779 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.685782 4728 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.744870 4728 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.799765 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.806391 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.830215 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.899563 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.899781 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 25 05:42:55 crc kubenswrapper[4728]: I0125 05:42:55.978661 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.049981 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.106534 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.183809 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.187890 4728 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.264815 4728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.265273 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://a97390983a8bb2e6b486b2a25ea92bbea6eea27ff1514fd0c13a1fa36ee792c8" gracePeriod=5 Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.272103 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.344657 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.367433 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.391146 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.423270 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.456137 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.465398 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.465970 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.636153 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.725351 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.765705 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.809878 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.826666 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.890042 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.958545 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.964204 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 25 05:42:56 crc kubenswrapper[4728]: I0125 05:42:56.970700 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.066691 4728 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.094659 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.221050 4728 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.305343 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.321222 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.354586 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.380175 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.464963 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.555437 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.564189 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.599355 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.601010 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.622728 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.636096 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.677093 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.745892 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.771374 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.810466 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.865208 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 25 05:42:57 crc kubenswrapper[4728]: I0125 05:42:57.941818 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.027111 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.075284 4728 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.179185 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.223406 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.276423 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.303062 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.329011 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.352552 4728 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.404873 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.473085 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.513851 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.680585 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 25 05:42:58 crc kubenswrapper[4728]: 
I0125 05:42:58.711913 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.773437 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.859378 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.934695 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.962774 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 25 05:42:58 crc kubenswrapper[4728]: I0125 05:42:58.997191 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.137792 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.346555 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.417044 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.535185 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.597744 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.608302 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.743890 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.828566 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.846568 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.916167 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 25 05:42:59 crc kubenswrapper[4728]: I0125 05:42:59.973401 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.066386 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.082895 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.120785 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.290067 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.322157 4728 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.383097 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.415805 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.437813 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.487988 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.505213 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.589023 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.644218 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.674848 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.726453 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.755891 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.801648 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.838252 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7c9g"] Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.838542 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d7c9g" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="registry-server" containerID="cri-o://9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" gracePeriod=30 Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.841999 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n9h8"] Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.842520 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9n9h8" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="registry-server" containerID="cri-o://1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" gracePeriod=30 Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.849393 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb65z"] Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.849505 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerName="marketplace-operator" containerID="cri-o://6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72" gracePeriod=30 Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.864226 4728 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-785sb"] Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.864638 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-785sb" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="registry-server" containerID="cri-o://05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc" gracePeriod=30 Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.866546 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6zfs"] Jan 25 05:43:00 crc kubenswrapper[4728]: I0125 05:43:00.866765 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6zfs" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="registry-server" containerID="cri-o://5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff" gracePeriod=30 Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.023538 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813 is running failed: container process not found" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" cmd=["grpc_health_probe","-addr=:50051"] Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.024025 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813 is running failed: container process not found" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" cmd=["grpc_health_probe","-addr=:50051"] Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.024533 4728 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813 is running failed: container process not found" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" cmd=["grpc_health_probe","-addr=:50051"] Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.024591 4728 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-d7c9g" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="registry-server" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.073045 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.240500 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb is running failed: container process not found" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" cmd=["grpc_health_probe","-addr=:50051"] Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.240847 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb is running failed: container process not found" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" cmd=["grpc_health_probe","-addr=:50051"] Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.241225 4728 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb is running failed: container process not found" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" cmd=["grpc_health_probe","-addr=:50051"] Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.241261 4728 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-9n9h8" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="registry-server" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.288583 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.291719 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.295941 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.301915 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.350496 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.359822 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.377716 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-conmon-a97390983a8bb2e6b486b2a25ea92bbea6eea27ff1514fd0c13a1fa36ee792c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a97390983a8bb2e6b486b2a25ea92bbea6eea27ff1514fd0c13a1fa36ee792c8.scope\": RecentStats: unable to find data in memory cache]" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.412829 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491675 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-catalog-content\") pod \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491731 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-operator-metrics\") pod \"95303aa9-3fb0-48a8-8df8-0f601653ac48\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491782 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x8b6\" (UniqueName: \"kubernetes.io/projected/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-kube-api-access-5x8b6\") pod \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\" (UID: 
\"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491803 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-catalog-content\") pod \"37f3e115-0b8f-473d-9cb3-dea0a2685889\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491847 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-utilities\") pod \"37f3e115-0b8f-473d-9cb3-dea0a2685889\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491882 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-utilities\") pod \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491923 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-catalog-content\") pod \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491958 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-utilities\") pod \"b59c4b5c-3733-4fed-8410-57526ce048b2\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491981 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8gsf\" (UniqueName: 
\"kubernetes.io/projected/95303aa9-3fb0-48a8-8df8-0f601653ac48-kube-api-access-s8gsf\") pod \"95303aa9-3fb0-48a8-8df8-0f601653ac48\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.491999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tv2k\" (UniqueName: \"kubernetes.io/projected/b59c4b5c-3733-4fed-8410-57526ce048b2-kube-api-access-4tv2k\") pod \"b59c4b5c-3733-4fed-8410-57526ce048b2\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.492018 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-utilities\") pod \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\" (UID: \"d6a82d6e-706c-4b85-81ca-9bf8fb99d904\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.492045 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-catalog-content\") pod \"b59c4b5c-3733-4fed-8410-57526ce048b2\" (UID: \"b59c4b5c-3733-4fed-8410-57526ce048b2\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.492066 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-trusted-ca\") pod \"95303aa9-3fb0-48a8-8df8-0f601653ac48\" (UID: \"95303aa9-3fb0-48a8-8df8-0f601653ac48\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.492090 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx2b9\" (UniqueName: \"kubernetes.io/projected/37f3e115-0b8f-473d-9cb3-dea0a2685889-kube-api-access-hx2b9\") pod \"37f3e115-0b8f-473d-9cb3-dea0a2685889\" (UID: \"37f3e115-0b8f-473d-9cb3-dea0a2685889\") " Jan 25 
05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.492126 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv96k\" (UniqueName: \"kubernetes.io/projected/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-kube-api-access-wv96k\") pod \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\" (UID: \"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.492892 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-utilities" (OuterVolumeSpecName: "utilities") pod "63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" (UID: "63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.493154 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-utilities" (OuterVolumeSpecName: "utilities") pod "d6a82d6e-706c-4b85-81ca-9bf8fb99d904" (UID: "d6a82d6e-706c-4b85-81ca-9bf8fb99d904"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.493189 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "95303aa9-3fb0-48a8-8df8-0f601653ac48" (UID: "95303aa9-3fb0-48a8-8df8-0f601653ac48"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.493745 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-utilities" (OuterVolumeSpecName: "utilities") pod "37f3e115-0b8f-473d-9cb3-dea0a2685889" (UID: "37f3e115-0b8f-473d-9cb3-dea0a2685889"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.494181 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-utilities" (OuterVolumeSpecName: "utilities") pod "b59c4b5c-3733-4fed-8410-57526ce048b2" (UID: "b59c4b5c-3733-4fed-8410-57526ce048b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.498628 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-kube-api-access-wv96k" (OuterVolumeSpecName: "kube-api-access-wv96k") pod "63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" (UID: "63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55"). InnerVolumeSpecName "kube-api-access-wv96k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.498812 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59c4b5c-3733-4fed-8410-57526ce048b2-kube-api-access-4tv2k" (OuterVolumeSpecName: "kube-api-access-4tv2k") pod "b59c4b5c-3733-4fed-8410-57526ce048b2" (UID: "b59c4b5c-3733-4fed-8410-57526ce048b2"). InnerVolumeSpecName "kube-api-access-4tv2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.498986 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f3e115-0b8f-473d-9cb3-dea0a2685889-kube-api-access-hx2b9" (OuterVolumeSpecName: "kube-api-access-hx2b9") pod "37f3e115-0b8f-473d-9cb3-dea0a2685889" (UID: "37f3e115-0b8f-473d-9cb3-dea0a2685889"). InnerVolumeSpecName "kube-api-access-hx2b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.499080 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-kube-api-access-5x8b6" (OuterVolumeSpecName: "kube-api-access-5x8b6") pod "d6a82d6e-706c-4b85-81ca-9bf8fb99d904" (UID: "d6a82d6e-706c-4b85-81ca-9bf8fb99d904"). InnerVolumeSpecName "kube-api-access-5x8b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.499135 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "95303aa9-3fb0-48a8-8df8-0f601653ac48" (UID: "95303aa9-3fb0-48a8-8df8-0f601653ac48"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.499214 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95303aa9-3fb0-48a8-8df8-0f601653ac48-kube-api-access-s8gsf" (OuterVolumeSpecName: "kube-api-access-s8gsf") pod "95303aa9-3fb0-48a8-8df8-0f601653ac48" (UID: "95303aa9-3fb0-48a8-8df8-0f601653ac48"). InnerVolumeSpecName "kube-api-access-s8gsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.513595 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b59c4b5c-3733-4fed-8410-57526ce048b2" (UID: "b59c4b5c-3733-4fed-8410-57526ce048b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.533205 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37f3e115-0b8f-473d-9cb3-dea0a2685889" (UID: "37f3e115-0b8f-473d-9cb3-dea0a2685889"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.543119 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" (UID: "63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.588835 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6a82d6e-706c-4b85-81ca-9bf8fb99d904" (UID: "d6a82d6e-706c-4b85-81ca-9bf8fb99d904"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593558 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593589 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593600 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8gsf\" (UniqueName: \"kubernetes.io/projected/95303aa9-3fb0-48a8-8df8-0f601653ac48-kube-api-access-s8gsf\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593613 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tv2k\" (UniqueName: \"kubernetes.io/projected/b59c4b5c-3733-4fed-8410-57526ce048b2-kube-api-access-4tv2k\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593622 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593630 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b59c4b5c-3733-4fed-8410-57526ce048b2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593642 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc 
kubenswrapper[4728]: I0125 05:43:01.593651 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx2b9\" (UniqueName: \"kubernetes.io/projected/37f3e115-0b8f-473d-9cb3-dea0a2685889-kube-api-access-hx2b9\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593660 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv96k\" (UniqueName: \"kubernetes.io/projected/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-kube-api-access-wv96k\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593668 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593676 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/95303aa9-3fb0-48a8-8df8-0f601653ac48-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593686 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x8b6\" (UniqueName: \"kubernetes.io/projected/d6a82d6e-706c-4b85-81ca-9bf8fb99d904-kube-api-access-5x8b6\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593694 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593702 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f3e115-0b8f-473d-9cb3-dea0a2685889-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.593710 4728 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.703127 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.706036 4728 generic.go:334] "Generic (PLEG): container finished" podID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerID="6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72" exitCode=0 Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.706100 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" event={"ID":"95303aa9-3fb0-48a8-8df8-0f601653ac48","Type":"ContainerDied","Data":"6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.706181 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" event={"ID":"95303aa9-3fb0-48a8-8df8-0f601653ac48","Type":"ContainerDied","Data":"30cb1327d3b79775309161b1127dd15937b5fd2347dcb12170d3c9c768fdac64"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.706227 4728 scope.go:117] "RemoveContainer" containerID="6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.706688 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb65z" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.709681 4728 generic.go:334] "Generic (PLEG): container finished" podID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" exitCode=0 Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.709758 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7c9g" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.709772 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7c9g" event={"ID":"37f3e115-0b8f-473d-9cb3-dea0a2685889","Type":"ContainerDied","Data":"9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.709838 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7c9g" event={"ID":"37f3e115-0b8f-473d-9cb3-dea0a2685889","Type":"ContainerDied","Data":"28303655ac5f500688365e9a862c5450b0cd778b52d97ae153a83d524a9055e1"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.711113 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.711152 4728 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="a97390983a8bb2e6b486b2a25ea92bbea6eea27ff1514fd0c13a1fa36ee792c8" exitCode=137 Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.713750 4728 generic.go:334] "Generic (PLEG): container finished" podID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" exitCode=0 Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 
05:43:01.713806 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerDied","Data":"1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.713836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n9h8" event={"ID":"63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55","Type":"ContainerDied","Data":"3c466404fd6f6b2ac8fbde73e9f68e1f32e535d55c183fc762f83210753a7e66"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.713885 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n9h8" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.715571 4728 generic.go:334] "Generic (PLEG): container finished" podID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerID="05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc" exitCode=0 Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.715614 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-785sb" event={"ID":"b59c4b5c-3733-4fed-8410-57526ce048b2","Type":"ContainerDied","Data":"05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.715631 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-785sb" event={"ID":"b59c4b5c-3733-4fed-8410-57526ce048b2","Type":"ContainerDied","Data":"36d0dc238edf0160a93f37629638e24e6f772c8132e4b1b6c71f2234c6590885"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.715706 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-785sb" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.718123 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerID="5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff" exitCode=0 Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.718153 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerDied","Data":"5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.718173 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6zfs" event={"ID":"d6a82d6e-706c-4b85-81ca-9bf8fb99d904","Type":"ContainerDied","Data":"30ff0604d63af93c092c9be80c39e72bc29af6fa8afc1ec00571ee73283066b7"} Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.718249 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6zfs" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.720843 4728 scope.go:117] "RemoveContainer" containerID="6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.721257 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72\": container with ID starting with 6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72 not found: ID does not exist" containerID="6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.721291 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72"} err="failed to get container status \"6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72\": rpc error: code = NotFound desc = could not find container \"6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72\": container with ID starting with 6a2ca7fc79fc5a925da0e27acf125acc6f2fc907a9d2567d30cb0730f6653e72 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.721330 4728 scope.go:117] "RemoveContainer" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.732992 4728 scope.go:117] "RemoveContainer" containerID="208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.738477 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7c9g"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.742714 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-d7c9g"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.751237 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n9h8"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.753396 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9n9h8"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.757076 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6zfs"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.760473 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6zfs"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.765135 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb65z"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.765517 4728 scope.go:117] "RemoveContainer" containerID="6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.769335 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb65z"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.772526 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-785sb"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.774619 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-785sb"] Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.778680 4728 scope.go:117] "RemoveContainer" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.779155 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813\": container with ID starting with 9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813 not found: ID does not exist" containerID="9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.779191 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813"} err="failed to get container status \"9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813\": rpc error: code = NotFound desc = could not find container \"9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813\": container with ID starting with 9b12260ed11df5b40f99c692138bf8d3c3be7361693f7c7856e5d17d2c41d813 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.779226 4728 scope.go:117] "RemoveContainer" containerID="208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.779639 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4\": container with ID starting with 208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4 not found: ID does not exist" containerID="208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.779685 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4"} err="failed to get container status \"208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4\": rpc error: code = NotFound desc = could not find container \"208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4\": container 
with ID starting with 208474b3185a020e4025a20fd5b30a0171f490e370f501c28c63e64fd93689a4 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.779701 4728 scope.go:117] "RemoveContainer" containerID="6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.779988 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057\": container with ID starting with 6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057 not found: ID does not exist" containerID="6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.780117 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057"} err="failed to get container status \"6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057\": rpc error: code = NotFound desc = could not find container \"6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057\": container with ID starting with 6c92d98c638a34e2c890663a9846333079642ee0e0005caa127151e40d765057 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.780221 4728 scope.go:117] "RemoveContainer" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.793030 4728 scope.go:117] "RemoveContainer" containerID="952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.805218 4728 scope.go:117] "RemoveContainer" containerID="bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.806730 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.806803 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.815353 4728 scope.go:117] "RemoveContainer" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.815646 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb\": container with ID starting with 1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb not found: ID does not exist" containerID="1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.815682 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb"} err="failed to get container status \"1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb\": rpc error: code = NotFound desc = could not find container \"1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb\": container with ID starting with 1e9e79fc62b0f1514fb62dff51b2357e3aa050b3aeca0b6198ba769c1eaa0ccb not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.815737 4728 scope.go:117] "RemoveContainer" containerID="952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.816469 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c\": container with ID starting with 952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c not found: ID does not exist" containerID="952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.816502 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c"} err="failed to get container status \"952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c\": rpc error: code = NotFound desc = could not find container \"952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c\": container with ID starting with 952da9908a9af4a64f7ac4156e45e6547a9b15123cfb647d8ba1b21abd8cb69c not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.816526 4728 scope.go:117] "RemoveContainer" containerID="bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.816777 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0\": container with ID starting with bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0 not found: ID does not exist" containerID="bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.816807 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0"} err="failed to get container status \"bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0\": rpc error: code = NotFound desc = could not find container \"bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0\": container with ID 
starting with bcb83bb346b491c3f4ef294844700da57bcc959d50fa954a77849fe7526232f0 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.816826 4728 scope.go:117] "RemoveContainer" containerID="05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.825596 4728 scope.go:117] "RemoveContainer" containerID="acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.835506 4728 scope.go:117] "RemoveContainer" containerID="d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.848203 4728 scope.go:117] "RemoveContainer" containerID="05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.848474 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc\": container with ID starting with 05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc not found: ID does not exist" containerID="05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.848501 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc"} err="failed to get container status \"05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc\": rpc error: code = NotFound desc = could not find container \"05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc\": container with ID starting with 05bbaf74be49c0f006e6899cb742297aaf9a24d9ccef5c4e51fae388067b7ecc not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.848518 4728 scope.go:117] "RemoveContainer" 
containerID="acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.848755 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474\": container with ID starting with acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474 not found: ID does not exist" containerID="acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.848782 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474"} err="failed to get container status \"acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474\": rpc error: code = NotFound desc = could not find container \"acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474\": container with ID starting with acfdf2abcb27ffa721ac5f953ed58e049f37fad6c589496df7d9adea3707e474 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.848800 4728 scope.go:117] "RemoveContainer" containerID="d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.849022 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b\": container with ID starting with d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b not found: ID does not exist" containerID="d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.849052 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b"} err="failed to get container status \"d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b\": rpc error: code = NotFound desc = could not find container \"d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b\": container with ID starting with d534a12a970c8de3642c6ebb733ce66c9fe145a99be975f97590ce98e16fa15b not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.849070 4728 scope.go:117] "RemoveContainer" containerID="5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.858681 4728 scope.go:117] "RemoveContainer" containerID="ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.870852 4728 scope.go:117] "RemoveContainer" containerID="4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.879511 4728 scope.go:117] "RemoveContainer" containerID="5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.880429 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff\": container with ID starting with 5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff not found: ID does not exist" containerID="5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.880474 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff"} err="failed to get container status \"5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff\": rpc error: code = 
NotFound desc = could not find container \"5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff\": container with ID starting with 5db592eb4f2ab07b666bb16ebe58adf0480cd59c751a80f85bbc8d914daba4ff not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.880507 4728 scope.go:117] "RemoveContainer" containerID="ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.881240 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812\": container with ID starting with ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812 not found: ID does not exist" containerID="ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.881355 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812"} err="failed to get container status \"ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812\": rpc error: code = NotFound desc = could not find container \"ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812\": container with ID starting with ae8b5f2d621402d86dd3f606413c2143af44967836d854f4f4057049f0be6812 not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.881677 4728 scope.go:117] "RemoveContainer" containerID="4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d" Jan 25 05:43:01 crc kubenswrapper[4728]: E0125 05:43:01.882089 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d\": container with ID starting with 
4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d not found: ID does not exist" containerID="4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.882226 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d"} err="failed to get container status \"4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d\": rpc error: code = NotFound desc = could not find container \"4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d\": container with ID starting with 4ce9527997dc0a5d29865ca09c6d548d2fe0087b6d250bd1f981d137c3d3ab5d not found: ID does not exist" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.936678 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.999817 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.999864 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.999887 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:43:01 crc kubenswrapper[4728]: I0125 05:43:01.999926 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:01.999945 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:01.999966 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:01.999970 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.000008 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.000113 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.000423 4728 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.000443 4728 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.000627 4728 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.000646 4728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.005952 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.061204 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.101429 4728 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.120522 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161115 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z726j"] Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161368 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161384 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161396 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161402 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161409 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161414 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161422 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerName="marketplace-operator" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161428 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerName="marketplace-operator" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161436 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161441 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161447 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161452 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161459 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161464 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161471 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161477 4728 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161483 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161489 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161498 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161503 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="extract-utilities" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161512 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161517 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161527 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" containerName="installer" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161534 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" containerName="installer" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161541 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161548 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161554 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161560 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="extract-content" Jan 25 05:43:02 crc kubenswrapper[4728]: E0125 05:43:02.161568 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161573 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161652 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161660 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161668 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ebc3f3-0aa9-4c3c-8e9b-e4e00d19e4f9" containerName="installer" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161675 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161682 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161690 4728 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" containerName="marketplace-operator" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.161698 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" containerName="registry-server" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.162114 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.163582 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.164503 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.164666 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.164959 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.167985 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.173594 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z726j"] Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.201973 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a549470e-be48-449d-b3e8-0caa23a23ee5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.202005 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a549470e-be48-449d-b3e8-0caa23a23ee5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.202061 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7hrf\" (UniqueName: \"kubernetes.io/projected/a549470e-be48-449d-b3e8-0caa23a23ee5-kube-api-access-s7hrf\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.272816 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.303096 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a549470e-be48-449d-b3e8-0caa23a23ee5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.303138 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a549470e-be48-449d-b3e8-0caa23a23ee5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.303209 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7hrf\" (UniqueName: \"kubernetes.io/projected/a549470e-be48-449d-b3e8-0caa23a23ee5-kube-api-access-s7hrf\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.304958 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a549470e-be48-449d-b3e8-0caa23a23ee5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.314786 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a549470e-be48-449d-b3e8-0caa23a23ee5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.317364 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7hrf\" (UniqueName: \"kubernetes.io/projected/a549470e-be48-449d-b3e8-0caa23a23ee5-kube-api-access-s7hrf\") pod \"marketplace-operator-79b997595-z726j\" (UID: \"a549470e-be48-449d-b3e8-0caa23a23ee5\") " pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.486149 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.727234 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.727575 4728 scope.go:117] "RemoveContainer" containerID="a97390983a8bb2e6b486b2a25ea92bbea6eea27ff1514fd0c13a1fa36ee792c8" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.727593 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.841864 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z726j"] Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.979480 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 25 05:43:02 crc kubenswrapper[4728]: I0125 05:43:02.979628 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.065430 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.095794 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.334462 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f3e115-0b8f-473d-9cb3-dea0a2685889" path="/var/lib/kubelet/pods/37f3e115-0b8f-473d-9cb3-dea0a2685889/volumes" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.335509 4728 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55" path="/var/lib/kubelet/pods/63b46b12-a3f2-41fc-9d28-3c9a0dbc8b55/volumes" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.336095 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95303aa9-3fb0-48a8-8df8-0f601653ac48" path="/var/lib/kubelet/pods/95303aa9-3fb0-48a8-8df8-0f601653ac48/volumes" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.337054 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b59c4b5c-3733-4fed-8410-57526ce048b2" path="/var/lib/kubelet/pods/b59c4b5c-3733-4fed-8410-57526ce048b2/volumes" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.337715 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a82d6e-706c-4b85-81ca-9bf8fb99d904" path="/var/lib/kubelet/pods/d6a82d6e-706c-4b85-81ca-9bf8fb99d904/volumes" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.338593 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.338860 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.350946 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.350978 4728 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="efec0579-1ac5-4c07-830f-b962cfc5bc27" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.359633 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 25 05:43:03 crc kubenswrapper[4728]: 
I0125 05:43:03.359661 4728 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="efec0579-1ac5-4c07-830f-b962cfc5bc27" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.732993 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" event={"ID":"a549470e-be48-449d-b3e8-0caa23a23ee5","Type":"ContainerStarted","Data":"be6a544db3859816de35ae542da54129dd159b081bba7bfd4ff9f097d9662d3e"} Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.733443 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.733457 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" event={"ID":"a549470e-be48-449d-b3e8-0caa23a23ee5","Type":"ContainerStarted","Data":"a484264d9f647f2412cbdecf40a51873bd13d053f58cf2b155121a698a63c3a8"} Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.738303 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.744940 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z726j" podStartSLOduration=3.7449267969999998 podStartE2EDuration="3.744926797s" podCreationTimestamp="2026-01-25 05:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:43:03.743228936 +0000 UTC m=+274.779106917" watchObservedRunningTime="2026-01-25 05:43:03.744926797 +0000 UTC m=+274.780804777" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.760910 4728 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.852397 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 25 05:43:03 crc kubenswrapper[4728]: I0125 05:43:03.948455 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 25 05:43:04 crc kubenswrapper[4728]: I0125 05:43:04.008555 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 25 05:43:04 crc kubenswrapper[4728]: I0125 05:43:04.379643 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 25 05:43:04 crc kubenswrapper[4728]: I0125 05:43:04.790465 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 25 05:43:05 crc kubenswrapper[4728]: I0125 05:43:05.409688 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.236259 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z8p7j"] Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.237086 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" podUID="24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" containerName="controller-manager" containerID="cri-o://bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2" gracePeriod=30 Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.340785 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872"] Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.341181 4728 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" podUID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" containerName="route-controller-manager" containerID="cri-o://acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d" gracePeriod=30 Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.518602 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.602462 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-config\") pod \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.602868 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-serving-cert\") pod \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.602930 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln82q\" (UniqueName: \"kubernetes.io/projected/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-kube-api-access-ln82q\") pod \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.602958 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-proxy-ca-bundles\") pod \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 
05:43:20.602980 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-client-ca\") pod \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\" (UID: \"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.603272 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-config" (OuterVolumeSpecName: "config") pod "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" (UID: "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.603372 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" (UID: "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.603663 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-client-ca" (OuterVolumeSpecName: "client-ca") pod "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" (UID: "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.603695 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.603835 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.608302 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" (UID: "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.618371 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-kube-api-access-ln82q" (OuterVolumeSpecName: "kube-api-access-ln82q") pod "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" (UID: "24af3f15-ccd7-4ff3-b8d5-1aff86c3a401"). InnerVolumeSpecName "kube-api-access-ln82q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.632165 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.704687 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-serving-cert\") pod \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.704773 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-config\") pod \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.704814 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc59d\" (UniqueName: \"kubernetes.io/projected/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-kube-api-access-mc59d\") pod \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.704839 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-client-ca\") pod \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\" (UID: \"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417\") " Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.705040 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.705059 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln82q\" (UniqueName: 
\"kubernetes.io/projected/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-kube-api-access-ln82q\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.705070 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401-client-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.705591 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" (UID: "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.705622 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-config" (OuterVolumeSpecName: "config") pod "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" (UID: "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.708649 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-kube-api-access-mc59d" (OuterVolumeSpecName: "kube-api-access-mc59d") pod "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" (UID: "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417"). InnerVolumeSpecName "kube-api-access-mc59d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.708699 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" (UID: "9b496ceb-6f7d-4b10-8faa-b0fee2b9b417"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.805589 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc59d\" (UniqueName: \"kubernetes.io/projected/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-kube-api-access-mc59d\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.805619 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-client-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.805632 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.805642 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.811695 4728 generic.go:334] "Generic (PLEG): container finished" podID="24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" containerID="bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2" exitCode=0 Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.811778 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" 
event={"ID":"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401","Type":"ContainerDied","Data":"bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2"} Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.811784 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.811846 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z8p7j" event={"ID":"24af3f15-ccd7-4ff3-b8d5-1aff86c3a401","Type":"ContainerDied","Data":"df21f70529fbb031c7d8937ecb878fbe86d9d40b88954a65348b969a14181775"} Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.811874 4728 scope.go:117] "RemoveContainer" containerID="bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.813279 4728 generic.go:334] "Generic (PLEG): container finished" podID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" containerID="acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d" exitCode=0 Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.813338 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" event={"ID":"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417","Type":"ContainerDied","Data":"acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d"} Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.813363 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" event={"ID":"9b496ceb-6f7d-4b10-8faa-b0fee2b9b417","Type":"ContainerDied","Data":"eecdfc9ff88756a89a98eb4a7f632e0acead1b8ed5d748addf1cf1aea83bbd2e"} Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.813419 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.827272 4728 scope.go:117] "RemoveContainer" containerID="bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2" Jan 25 05:43:20 crc kubenswrapper[4728]: E0125 05:43:20.827788 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2\": container with ID starting with bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2 not found: ID does not exist" containerID="bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.827819 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2"} err="failed to get container status \"bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2\": rpc error: code = NotFound desc = could not find container \"bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2\": container with ID starting with bac428717bef6bc6ae9011aa8babf366fd6dcb99b15405e4fe21007c20b0ddf2 not found: ID does not exist" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.827841 4728 scope.go:117] "RemoveContainer" containerID="acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.836629 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872"] Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.839209 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8872"] Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.841580 4728 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z8p7j"] Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.843593 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z8p7j"] Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.844400 4728 scope.go:117] "RemoveContainer" containerID="acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d" Jan 25 05:43:20 crc kubenswrapper[4728]: E0125 05:43:20.844776 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d\": container with ID starting with acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d not found: ID does not exist" containerID="acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d" Jan 25 05:43:20 crc kubenswrapper[4728]: I0125 05:43:20.844819 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d"} err="failed to get container status \"acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d\": rpc error: code = NotFound desc = could not find container \"acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d\": container with ID starting with acca94b83705e87f0b6992fd69a2791278a1610b965a473bfa1453fab7975d1d not found: ID does not exist" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.333668 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" path="/var/lib/kubelet/pods/24af3f15-ccd7-4ff3-b8d5-1aff86c3a401/volumes" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.335208 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" 
path="/var/lib/kubelet/pods/9b496ceb-6f7d-4b10-8faa-b0fee2b9b417/volumes" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.706152 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-588cccb9d7-bk55v"] Jan 25 05:43:21 crc kubenswrapper[4728]: E0125 05:43:21.706492 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" containerName="route-controller-manager" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.706509 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" containerName="route-controller-manager" Jan 25 05:43:21 crc kubenswrapper[4728]: E0125 05:43:21.706531 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" containerName="controller-manager" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.706542 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" containerName="controller-manager" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.706675 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b496ceb-6f7d-4b10-8faa-b0fee2b9b417" containerName="route-controller-manager" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.706687 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="24af3f15-ccd7-4ff3-b8d5-1aff86c3a401" containerName="controller-manager" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.707207 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.708371 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b"] Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.708956 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.709010 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.709117 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.709980 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.710012 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.711069 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.711509 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.711554 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.711789 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.711877 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.711987 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.712031 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.712623 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.717016 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.717132 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-588cccb9d7-bk55v"] Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.720608 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b"] Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815038 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-config\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815087 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-client-ca\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815120 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-client-ca\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815145 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-proxy-ca-bundles\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815211 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85c5618a-b180-45c5-adb8-453c8f404e14-serving-cert\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815315 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a714c74d-ff6c-47da-bb4e-1851115ba3ea-serving-cert\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " 
pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815381 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v54b5\" (UniqueName: \"kubernetes.io/projected/a714c74d-ff6c-47da-bb4e-1851115ba3ea-kube-api-access-v54b5\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815428 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-config\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.815471 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tgk8\" (UniqueName: \"kubernetes.io/projected/85c5618a-b180-45c5-adb8-453c8f404e14-kube-api-access-7tgk8\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917111 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tgk8\" (UniqueName: \"kubernetes.io/projected/85c5618a-b180-45c5-adb8-453c8f404e14-kube-api-access-7tgk8\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917351 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-config\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917389 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-client-ca\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917421 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-client-ca\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917453 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-proxy-ca-bundles\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917470 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85c5618a-b180-45c5-adb8-453c8f404e14-serving-cert\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 
crc kubenswrapper[4728]: I0125 05:43:21.917504 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a714c74d-ff6c-47da-bb4e-1851115ba3ea-serving-cert\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917533 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v54b5\" (UniqueName: \"kubernetes.io/projected/a714c74d-ff6c-47da-bb4e-1851115ba3ea-kube-api-access-v54b5\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.917569 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-config\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.918436 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-client-ca\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.918468 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-client-ca\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: 
\"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.918580 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-config\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.919009 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-config\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.920476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85c5618a-b180-45c5-adb8-453c8f404e14-serving-cert\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.921480 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a714c74d-ff6c-47da-bb4e-1851115ba3ea-serving-cert\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.922591 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/85c5618a-b180-45c5-adb8-453c8f404e14-proxy-ca-bundles\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.933253 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v54b5\" (UniqueName: \"kubernetes.io/projected/a714c74d-ff6c-47da-bb4e-1851115ba3ea-kube-api-access-v54b5\") pod \"route-controller-manager-588756b8c7-w6r6b\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:21 crc kubenswrapper[4728]: I0125 05:43:21.933449 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tgk8\" (UniqueName: \"kubernetes.io/projected/85c5618a-b180-45c5-adb8-453c8f404e14-kube-api-access-7tgk8\") pod \"controller-manager-588cccb9d7-bk55v\" (UID: \"85c5618a-b180-45c5-adb8-453c8f404e14\") " pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.020653 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.029808 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.358698 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b"] Jan 25 05:43:22 crc kubenswrapper[4728]: W0125 05:43:22.361977 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda714c74d_ff6c_47da_bb4e_1851115ba3ea.slice/crio-518987e0d7d82d21ebf34bee87555b2ff571ec5ac2ad8f74eab8ccb37529591d WatchSource:0}: Error finding container 518987e0d7d82d21ebf34bee87555b2ff571ec5ac2ad8f74eab8ccb37529591d: Status 404 returned error can't find the container with id 518987e0d7d82d21ebf34bee87555b2ff571ec5ac2ad8f74eab8ccb37529591d Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.392605 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-588cccb9d7-bk55v"] Jan 25 05:43:22 crc kubenswrapper[4728]: W0125 05:43:22.395657 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c5618a_b180_45c5_adb8_453c8f404e14.slice/crio-f364db71b48c1ed7cc03aa3c1c4d4485a4e7233293ed75731f389f44f13607d3 WatchSource:0}: Error finding container f364db71b48c1ed7cc03aa3c1c4d4485a4e7233293ed75731f389f44f13607d3: Status 404 returned error can't find the container with id f364db71b48c1ed7cc03aa3c1c4d4485a4e7233293ed75731f389f44f13607d3 Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.828748 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" event={"ID":"a714c74d-ff6c-47da-bb4e-1851115ba3ea","Type":"ContainerStarted","Data":"e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc"} Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.828788 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" event={"ID":"a714c74d-ff6c-47da-bb4e-1851115ba3ea","Type":"ContainerStarted","Data":"518987e0d7d82d21ebf34bee87555b2ff571ec5ac2ad8f74eab8ccb37529591d"} Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.829011 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.830767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" event={"ID":"85c5618a-b180-45c5-adb8-453c8f404e14","Type":"ContainerStarted","Data":"3ef1aa7e10ff89292c6ee9faffcc6b3d7b4fea1e42833cd6d1c59afefcb74f0e"} Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.830807 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" event={"ID":"85c5618a-b180-45c5-adb8-453c8f404e14","Type":"ContainerStarted","Data":"f364db71b48c1ed7cc03aa3c1c4d4485a4e7233293ed75731f389f44f13607d3"} Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.830892 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.833944 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.844497 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" podStartSLOduration=2.844487408 podStartE2EDuration="2.844487408s" podCreationTimestamp="2026-01-25 05:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:43:22.842442143 +0000 UTC m=+293.878320113" watchObservedRunningTime="2026-01-25 05:43:22.844487408 +0000 UTC m=+293.880365388" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.856840 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-588cccb9d7-bk55v" podStartSLOduration=2.856830031 podStartE2EDuration="2.856830031s" podCreationTimestamp="2026-01-25 05:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:43:22.854429116 +0000 UTC m=+293.890307096" watchObservedRunningTime="2026-01-25 05:43:22.856830031 +0000 UTC m=+293.892708011" Jan 25 05:43:22 crc kubenswrapper[4728]: I0125 05:43:22.950010 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:43:29 crc kubenswrapper[4728]: I0125 05:43:29.065939 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-phpgg"] Jan 25 05:43:29 crc kubenswrapper[4728]: I0125 05:43:29.222526 4728 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.675523 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pss7z"] Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.678142 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.680933 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.682230 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pss7z"] Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.753121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-catalog-content\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.753180 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmm2v\" (UniqueName: \"kubernetes.io/projected/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-kube-api-access-cmm2v\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.753227 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-utilities\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.854362 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-catalog-content\") pod \"redhat-marketplace-pss7z\" (UID: 
\"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.854419 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmm2v\" (UniqueName: \"kubernetes.io/projected/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-kube-api-access-cmm2v\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.854470 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-utilities\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.854853 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-utilities\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.855054 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-catalog-content\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.870185 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khmzg"] Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.871600 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.872561 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmm2v\" (UniqueName: \"kubernetes.io/projected/ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e-kube-api-access-cmm2v\") pod \"redhat-marketplace-pss7z\" (UID: \"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e\") " pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.873580 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.913220 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khmzg"] Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.956309 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad51ca6-3feb-4d91-b168-4330e2698fc1-catalog-content\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.956810 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad51ca6-3feb-4d91-b168-4330e2698fc1-utilities\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.957059 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx82g\" (UniqueName: \"kubernetes.io/projected/bad51ca6-3feb-4d91-b168-4330e2698fc1-kube-api-access-tx82g\") pod \"redhat-operators-khmzg\" (UID: 
\"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:41 crc kubenswrapper[4728]: I0125 05:43:41.992660 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.058463 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx82g\" (UniqueName: \"kubernetes.io/projected/bad51ca6-3feb-4d91-b168-4330e2698fc1-kube-api-access-tx82g\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.058537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad51ca6-3feb-4d91-b168-4330e2698fc1-catalog-content\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.058587 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad51ca6-3feb-4d91-b168-4330e2698fc1-utilities\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.059766 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad51ca6-3feb-4d91-b168-4330e2698fc1-utilities\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.060061 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bad51ca6-3feb-4d91-b168-4330e2698fc1-catalog-content\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.075715 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx82g\" (UniqueName: \"kubernetes.io/projected/bad51ca6-3feb-4d91-b168-4330e2698fc1-kube-api-access-tx82g\") pod \"redhat-operators-khmzg\" (UID: \"bad51ca6-3feb-4d91-b168-4330e2698fc1\") " pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.198752 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.346269 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pss7z"] Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.566362 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khmzg"] Jan 25 05:43:42 crc kubenswrapper[4728]: W0125 05:43:42.570421 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad51ca6_3feb_4d91_b168_4330e2698fc1.slice/crio-6bdfe3b3d98b1c301e63af197098adbd73aec639a9995b221e6b2725bfb1400f WatchSource:0}: Error finding container 6bdfe3b3d98b1c301e63af197098adbd73aec639a9995b221e6b2725bfb1400f: Status 404 returned error can't find the container with id 6bdfe3b3d98b1c301e63af197098adbd73aec639a9995b221e6b2725bfb1400f Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.918582 4728 generic.go:334] "Generic (PLEG): container finished" podID="ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e" containerID="451ee73582af7ab96090155eabd726ecada0c1cde176268c9a878fe2b71e5c5a" exitCode=0 Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.918633 
4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pss7z" event={"ID":"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e","Type":"ContainerDied","Data":"451ee73582af7ab96090155eabd726ecada0c1cde176268c9a878fe2b71e5c5a"} Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.918925 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pss7z" event={"ID":"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e","Type":"ContainerStarted","Data":"6646754902be2d8a57b0a9bc05f4dd6e9862fd4799b555a526fedd9cd6fe3d7f"} Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.920871 4728 generic.go:334] "Generic (PLEG): container finished" podID="bad51ca6-3feb-4d91-b168-4330e2698fc1" containerID="588fb61bc7933d55f7668313ef0033d0bfefef73a103f41bcaedd627046b66c6" exitCode=0 Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.920935 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khmzg" event={"ID":"bad51ca6-3feb-4d91-b168-4330e2698fc1","Type":"ContainerDied","Data":"588fb61bc7933d55f7668313ef0033d0bfefef73a103f41bcaedd627046b66c6"} Jan 25 05:43:42 crc kubenswrapper[4728]: I0125 05:43:42.921025 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khmzg" event={"ID":"bad51ca6-3feb-4d91-b168-4330e2698fc1","Type":"ContainerStarted","Data":"6bdfe3b3d98b1c301e63af197098adbd73aec639a9995b221e6b2725bfb1400f"} Jan 25 05:43:43 crc kubenswrapper[4728]: I0125 05:43:43.930180 4728 generic.go:334] "Generic (PLEG): container finished" podID="ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e" containerID="755f4e092efbb104baac6cc61dd0753688d62ad0d31a3e6a42f29ce3a2dd7098" exitCode=0 Jan 25 05:43:43 crc kubenswrapper[4728]: I0125 05:43:43.930361 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pss7z" 
event={"ID":"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e","Type":"ContainerDied","Data":"755f4e092efbb104baac6cc61dd0753688d62ad0d31a3e6a42f29ce3a2dd7098"} Jan 25 05:43:43 crc kubenswrapper[4728]: I0125 05:43:43.933212 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khmzg" event={"ID":"bad51ca6-3feb-4d91-b168-4330e2698fc1","Type":"ContainerStarted","Data":"56ed23fbc861223113a98ea87648bc1d83858777a01941829472190bfdac0dac"} Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.067920 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7hjgs"] Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.068797 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.070688 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.078249 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7hjgs"] Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.083429 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-catalog-content\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.083464 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgfcq\" (UniqueName: \"kubernetes.io/projected/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-kube-api-access-rgfcq\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " 
pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.083554 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-utilities\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.183912 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-catalog-content\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.184090 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgfcq\" (UniqueName: \"kubernetes.io/projected/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-kube-api-access-rgfcq\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.184224 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-utilities\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.184314 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-catalog-content\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " 
pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.184653 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-utilities\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.200040 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgfcq\" (UniqueName: \"kubernetes.io/projected/0d0a6d26-536c-4931-9aa7-803fe8bb55a3-kube-api-access-rgfcq\") pod \"certified-operators-7hjgs\" (UID: \"0d0a6d26-536c-4931-9aa7-803fe8bb55a3\") " pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.266456 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b255c"] Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.270906 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.275741 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.285440 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-catalog-content\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.285513 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-utilities\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.285564 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87gpn\" (UniqueName: \"kubernetes.io/projected/8b2f959d-89c4-43df-98a5-b8c37490dff7-kube-api-access-87gpn\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.291089 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b255c"] Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.381779 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.387403 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gpn\" (UniqueName: \"kubernetes.io/projected/8b2f959d-89c4-43df-98a5-b8c37490dff7-kube-api-access-87gpn\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.387535 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-catalog-content\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.387635 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-utilities\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.388093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-catalog-content\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.388343 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-utilities\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " 
pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.402855 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gpn\" (UniqueName: \"kubernetes.io/projected/8b2f959d-89c4-43df-98a5-b8c37490dff7-kube-api-access-87gpn\") pod \"community-operators-b255c\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.594532 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.739902 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7hjgs"] Jan 25 05:43:44 crc kubenswrapper[4728]: W0125 05:43:44.745822 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d0a6d26_536c_4931_9aa7_803fe8bb55a3.slice/crio-36848e54cd5a941df952630531fa400918da742b95bdf0264274bc89b69da396 WatchSource:0}: Error finding container 36848e54cd5a941df952630531fa400918da742b95bdf0264274bc89b69da396: Status 404 returned error can't find the container with id 36848e54cd5a941df952630531fa400918da742b95bdf0264274bc89b69da396 Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.938777 4728 generic.go:334] "Generic (PLEG): container finished" podID="bad51ca6-3feb-4d91-b168-4330e2698fc1" containerID="56ed23fbc861223113a98ea87648bc1d83858777a01941829472190bfdac0dac" exitCode=0 Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.938834 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khmzg" event={"ID":"bad51ca6-3feb-4d91-b168-4330e2698fc1","Type":"ContainerDied","Data":"56ed23fbc861223113a98ea87648bc1d83858777a01941829472190bfdac0dac"} Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.941881 
4728 generic.go:334] "Generic (PLEG): container finished" podID="0d0a6d26-536c-4931-9aa7-803fe8bb55a3" containerID="c768c777627315dc9d57ef2d9104e694749c3c99de3ec32f56384dcf6abcb8b9" exitCode=0 Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.941946 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjgs" event={"ID":"0d0a6d26-536c-4931-9aa7-803fe8bb55a3","Type":"ContainerDied","Data":"c768c777627315dc9d57ef2d9104e694749c3c99de3ec32f56384dcf6abcb8b9"} Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.941975 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjgs" event={"ID":"0d0a6d26-536c-4931-9aa7-803fe8bb55a3","Type":"ContainerStarted","Data":"36848e54cd5a941df952630531fa400918da742b95bdf0264274bc89b69da396"} Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.945815 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pss7z" event={"ID":"ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e","Type":"ContainerStarted","Data":"99c98f2fff25efce8b4d7baa9614fa24a95e6fb7f5717a540f63eebcf3c0424b"} Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.954289 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b255c"] Jan 25 05:43:44 crc kubenswrapper[4728]: W0125 05:43:44.960995 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2f959d_89c4_43df_98a5_b8c37490dff7.slice/crio-64099e9c7eecee19c1a17622b89187559713cbe3f1879864d8e2e984ca0f567f WatchSource:0}: Error finding container 64099e9c7eecee19c1a17622b89187559713cbe3f1879864d8e2e984ca0f567f: Status 404 returned error can't find the container with id 64099e9c7eecee19c1a17622b89187559713cbe3f1879864d8e2e984ca0f567f Jan 25 05:43:44 crc kubenswrapper[4728]: I0125 05:43:44.985604 4728 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-pss7z" podStartSLOduration=2.453191906 podStartE2EDuration="3.985591235s" podCreationTimestamp="2026-01-25 05:43:41 +0000 UTC" firstStartedPulling="2026-01-25 05:43:42.919934492 +0000 UTC m=+313.955812471" lastFinishedPulling="2026-01-25 05:43:44.45233382 +0000 UTC m=+315.488211800" observedRunningTime="2026-01-25 05:43:44.985198105 +0000 UTC m=+316.021076085" watchObservedRunningTime="2026-01-25 05:43:44.985591235 +0000 UTC m=+316.021469225" Jan 25 05:43:45 crc kubenswrapper[4728]: I0125 05:43:45.951344 4728 generic.go:334] "Generic (PLEG): container finished" podID="0d0a6d26-536c-4931-9aa7-803fe8bb55a3" containerID="f0e029dd3239d89894c12a5fa55db718481961810b93a3e05dd6c7702e6916b0" exitCode=0 Jan 25 05:43:45 crc kubenswrapper[4728]: I0125 05:43:45.951520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjgs" event={"ID":"0d0a6d26-536c-4931-9aa7-803fe8bb55a3","Type":"ContainerDied","Data":"f0e029dd3239d89894c12a5fa55db718481961810b93a3e05dd6c7702e6916b0"} Jan 25 05:43:45 crc kubenswrapper[4728]: I0125 05:43:45.953527 4728 generic.go:334] "Generic (PLEG): container finished" podID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerID="c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa" exitCode=0 Jan 25 05:43:45 crc kubenswrapper[4728]: I0125 05:43:45.953569 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b255c" event={"ID":"8b2f959d-89c4-43df-98a5-b8c37490dff7","Type":"ContainerDied","Data":"c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa"} Jan 25 05:43:45 crc kubenswrapper[4728]: I0125 05:43:45.953588 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b255c" event={"ID":"8b2f959d-89c4-43df-98a5-b8c37490dff7","Type":"ContainerStarted","Data":"64099e9c7eecee19c1a17622b89187559713cbe3f1879864d8e2e984ca0f567f"} Jan 25 05:43:45 
crc kubenswrapper[4728]: I0125 05:43:45.957715 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khmzg" event={"ID":"bad51ca6-3feb-4d91-b168-4330e2698fc1","Type":"ContainerStarted","Data":"b79db983dc0368e29e3183fd320de051730a553fcb5adc3cf627b88152898296"} Jan 25 05:43:45 crc kubenswrapper[4728]: I0125 05:43:45.985442 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khmzg" podStartSLOduration=2.504272105 podStartE2EDuration="4.985427758s" podCreationTimestamp="2026-01-25 05:43:41 +0000 UTC" firstStartedPulling="2026-01-25 05:43:42.92183255 +0000 UTC m=+313.957710529" lastFinishedPulling="2026-01-25 05:43:45.402988202 +0000 UTC m=+316.438866182" observedRunningTime="2026-01-25 05:43:45.982051545 +0000 UTC m=+317.017929526" watchObservedRunningTime="2026-01-25 05:43:45.985427758 +0000 UTC m=+317.021305739" Jan 25 05:43:46 crc kubenswrapper[4728]: I0125 05:43:46.966299 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hjgs" event={"ID":"0d0a6d26-536c-4931-9aa7-803fe8bb55a3","Type":"ContainerStarted","Data":"4695d6038e7284125b4ed22ff33ef96b9e8593204f095ed07c7fbbcdb72eb2f3"} Jan 25 05:43:46 crc kubenswrapper[4728]: I0125 05:43:46.968569 4728 generic.go:334] "Generic (PLEG): container finished" podID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerID="477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4" exitCode=0 Jan 25 05:43:46 crc kubenswrapper[4728]: I0125 05:43:46.968632 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b255c" event={"ID":"8b2f959d-89c4-43df-98a5-b8c37490dff7","Type":"ContainerDied","Data":"477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4"} Jan 25 05:43:46 crc kubenswrapper[4728]: I0125 05:43:46.985979 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-7hjgs" podStartSLOduration=1.529916402 podStartE2EDuration="2.985968528s" podCreationTimestamp="2026-01-25 05:43:44 +0000 UTC" firstStartedPulling="2026-01-25 05:43:44.943140303 +0000 UTC m=+315.979018283" lastFinishedPulling="2026-01-25 05:43:46.399192419 +0000 UTC m=+317.435070409" observedRunningTime="2026-01-25 05:43:46.982462922 +0000 UTC m=+318.018340902" watchObservedRunningTime="2026-01-25 05:43:46.985968528 +0000 UTC m=+318.021846509" Jan 25 05:43:47 crc kubenswrapper[4728]: I0125 05:43:47.975966 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b255c" event={"ID":"8b2f959d-89c4-43df-98a5-b8c37490dff7","Type":"ContainerStarted","Data":"0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755"} Jan 25 05:43:47 crc kubenswrapper[4728]: I0125 05:43:47.993590 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b255c" podStartSLOduration=2.469713731 podStartE2EDuration="3.993561311s" podCreationTimestamp="2026-01-25 05:43:44 +0000 UTC" firstStartedPulling="2026-01-25 05:43:45.955271248 +0000 UTC m=+316.991149228" lastFinishedPulling="2026-01-25 05:43:47.479118828 +0000 UTC m=+318.514996808" observedRunningTime="2026-01-25 05:43:47.990266671 +0000 UTC m=+319.026144651" watchObservedRunningTime="2026-01-25 05:43:47.993561311 +0000 UTC m=+319.029439291" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.232145 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zhjcp"] Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.234107 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.254535 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zhjcp"] Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282038 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282084 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7t92\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-kube-api-access-v7t92\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282113 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-trusted-ca\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282137 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-bound-sa-token\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" 
Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282173 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-registry-certificates\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282226 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282265 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-registry-tls\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.282292 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.309062 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383507 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383565 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383590 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7t92\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-kube-api-access-v7t92\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383618 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-trusted-ca\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383639 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-bound-sa-token\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383668 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-registry-certificates\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.383729 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-registry-tls\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.384001 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.384710 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-trusted-ca\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc 
kubenswrapper[4728]: I0125 05:43:51.384882 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-registry-certificates\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.389932 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-registry-tls\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.396207 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.397093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7t92\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-kube-api-access-v7t92\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.398119 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c67d511e-0752-4d9e-9bdb-6e9a5c50d29f-bound-sa-token\") pod \"image-registry-66df7c8f76-zhjcp\" (UID: \"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.547481 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.916971 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zhjcp"] Jan 25 05:43:51 crc kubenswrapper[4728]: W0125 05:43:51.921760 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67d511e_0752_4d9e_9bdb_6e9a5c50d29f.slice/crio-1dbe91396d315593a02cac077927a899d4d1dd214d0e09bf4d63a8721e2d19b3 WatchSource:0}: Error finding container 1dbe91396d315593a02cac077927a899d4d1dd214d0e09bf4d63a8721e2d19b3: Status 404 returned error can't find the container with id 1dbe91396d315593a02cac077927a899d4d1dd214d0e09bf4d63a8721e2d19b3 Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.992825 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.992869 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:51 crc kubenswrapper[4728]: I0125 05:43:51.996545 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" event={"ID":"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f","Type":"ContainerStarted","Data":"1dbe91396d315593a02cac077927a899d4d1dd214d0e09bf4d63a8721e2d19b3"} Jan 25 05:43:52 crc kubenswrapper[4728]: I0125 05:43:52.026887 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:52 crc kubenswrapper[4728]: I0125 05:43:52.199311 4728 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:52 crc kubenswrapper[4728]: I0125 05:43:52.199797 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:52 crc kubenswrapper[4728]: I0125 05:43:52.227778 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:53 crc kubenswrapper[4728]: I0125 05:43:53.005422 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" event={"ID":"c67d511e-0752-4d9e-9bdb-6e9a5c50d29f","Type":"ContainerStarted","Data":"dfccf9f6cd413b9824ab6e6d4d37229bd3880236df5540faa5fdaab12da97f25"} Jan 25 05:43:53 crc kubenswrapper[4728]: I0125 05:43:53.006112 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:43:53 crc kubenswrapper[4728]: I0125 05:43:53.025608 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" podStartSLOduration=2.025593039 podStartE2EDuration="2.025593039s" podCreationTimestamp="2026-01-25 05:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:43:53.023532286 +0000 UTC m=+324.059410265" watchObservedRunningTime="2026-01-25 05:43:53.025593039 +0000 UTC m=+324.061471019" Jan 25 05:43:53 crc kubenswrapper[4728]: I0125 05:43:53.034149 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pss7z" Jan 25 05:43:53 crc kubenswrapper[4728]: I0125 05:43:53.044483 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khmzg" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 
05:43:54.088915 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" podUID="fb6303d1-5e9c-41a4-8923-5ea2ed774af8" containerName="oauth-openshift" containerID="cri-o://5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87" gracePeriod=15 Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.382111 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.382491 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.418100 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.463994 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.490526 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-d46df4d5f-8p7kj"] Jan 25 05:43:54 crc kubenswrapper[4728]: E0125 05:43:54.490833 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6303d1-5e9c-41a4-8923-5ea2ed774af8" containerName="oauth-openshift" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.490853 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6303d1-5e9c-41a4-8923-5ea2ed774af8" containerName="oauth-openshift" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.490962 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6303d1-5e9c-41a4-8923-5ea2ed774af8" containerName="oauth-openshift" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.491509 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.511032 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d46df4d5f-8p7kj"] Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.595681 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.595727 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.630915 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636098 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7pfh\" 
(UniqueName: \"kubernetes.io/projected/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-kube-api-access-q7pfh\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636186 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-dir\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636214 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-session\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636236 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-provider-selection\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636267 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-serving-cert\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636286 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-error\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636313 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-service-ca\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636346 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-policies\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636362 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-login\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636357 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636432 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-router-certs\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636450 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-ocp-branding-template\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636513 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-cliconfig\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636559 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-idp-0-file-data\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636619 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-trusted-ca-bundle\") pod \"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\" (UID: 
\"fb6303d1-5e9c-41a4-8923-5ea2ed774af8\") " Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636807 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636836 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-session\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636873 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636893 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636921 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1848c1d-4b14-490e-95cb-9d142046d994-audit-dir\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636938 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636961 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-service-ca\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.636992 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-router-certs\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637012 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637035 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-login\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637054 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-audit-policies\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637073 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637092 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-error\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: 
\"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637107 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfn5c\" (UniqueName: \"kubernetes.io/projected/b1848c1d-4b14-490e-95cb-9d142046d994-kube-api-access-vfn5c\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637138 4728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637296 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637449 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637763 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.637384 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.643057 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.643289 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.643390 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.643655 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.643886 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.644957 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.645012 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-kube-api-access-q7pfh" (OuterVolumeSpecName: "kube-api-access-q7pfh") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "kube-api-access-q7pfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.648928 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.649400 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fb6303d1-5e9c-41a4-8923-5ea2ed774af8" (UID: "fb6303d1-5e9c-41a4-8923-5ea2ed774af8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.737933 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1848c1d-4b14-490e-95cb-9d142046d994-audit-dir\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.737969 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.737994 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-service-ca\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738010 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1848c1d-4b14-490e-95cb-9d142046d994-audit-dir\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738022 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-router-certs\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738111 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738155 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-login\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738176 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-audit-policies\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738228 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 
05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738272 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-error\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738294 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfn5c\" (UniqueName: \"kubernetes.io/projected/b1848c1d-4b14-490e-95cb-9d142046d994-kube-api-access-vfn5c\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738386 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738417 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-session\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738475 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738503 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738590 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738606 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7pfh\" (UniqueName: \"kubernetes.io/projected/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-kube-api-access-q7pfh\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738622 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738637 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738651 4728 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738662 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738675 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738687 4728 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738699 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738714 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738726 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-ocp-branding-template\") on node \"crc\" 
DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738739 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.738753 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb6303d1-5e9c-41a4-8923-5ea2ed774af8-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.739295 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-audit-policies\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.739408 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.739899 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-service-ca\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.741444 4728 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-router-certs\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.742076 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-error\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.742154 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.742269 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.742348 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: 
\"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.742430 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-session\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.742822 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.743508 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-user-template-login\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.743920 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1848c1d-4b14-490e-95cb-9d142046d994-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.753915 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vfn5c\" (UniqueName: \"kubernetes.io/projected/b1848c1d-4b14-490e-95cb-9d142046d994-kube-api-access-vfn5c\") pod \"oauth-openshift-d46df4d5f-8p7kj\" (UID: \"b1848c1d-4b14-490e-95cb-9d142046d994\") " pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:54 crc kubenswrapper[4728]: I0125 05:43:54.813890 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.015034 4728 generic.go:334] "Generic (PLEG): container finished" podID="fb6303d1-5e9c-41a4-8923-5ea2ed774af8" containerID="5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87" exitCode=0 Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.015252 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" event={"ID":"fb6303d1-5e9c-41a4-8923-5ea2ed774af8","Type":"ContainerDied","Data":"5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87"} Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.015436 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" event={"ID":"fb6303d1-5e9c-41a4-8923-5ea2ed774af8","Type":"ContainerDied","Data":"f00bdf1e60cac46d27f8cff20e1eea86db902f00a3fc9e6022b1b69d0cded71f"} Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.015459 4728 scope.go:117] "RemoveContainer" containerID="5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.015351 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76f84477b-phpgg" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.028413 4728 scope.go:117] "RemoveContainer" containerID="5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87" Jan 25 05:43:55 crc kubenswrapper[4728]: E0125 05:43:55.028695 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87\": container with ID starting with 5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87 not found: ID does not exist" containerID="5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.028734 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87"} err="failed to get container status \"5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87\": rpc error: code = NotFound desc = could not find container \"5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87\": container with ID starting with 5c89d4aaca2248bea1fa0346159555a2b4b8fc7c5396c9ab7c6f2bc22ad89d87 not found: ID does not exist" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.042344 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-phpgg"] Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.047770 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7hjgs" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.050357 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b255c" Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.050630 4728 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-phpgg"] Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.171253 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d46df4d5f-8p7kj"] Jan 25 05:43:55 crc kubenswrapper[4728]: W0125 05:43:55.174627 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1848c1d_4b14_490e_95cb_9d142046d994.slice/crio-b241c4221601df9697b0d527a1704ca21776a3444128d5bf0ffdb76507d52546 WatchSource:0}: Error finding container b241c4221601df9697b0d527a1704ca21776a3444128d5bf0ffdb76507d52546: Status 404 returned error can't find the container with id b241c4221601df9697b0d527a1704ca21776a3444128d5bf0ffdb76507d52546 Jan 25 05:43:55 crc kubenswrapper[4728]: I0125 05:43:55.337150 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6303d1-5e9c-41a4-8923-5ea2ed774af8" path="/var/lib/kubelet/pods/fb6303d1-5e9c-41a4-8923-5ea2ed774af8/volumes" Jan 25 05:43:56 crc kubenswrapper[4728]: I0125 05:43:56.025236 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" event={"ID":"b1848c1d-4b14-490e-95cb-9d142046d994","Type":"ContainerStarted","Data":"dda72bfc798d1b7f3e67064db512fff7f8c5d4bc6bacd167ca977602b7fb65ab"} Jan 25 05:43:56 crc kubenswrapper[4728]: I0125 05:43:56.025538 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" event={"ID":"b1848c1d-4b14-490e-95cb-9d142046d994","Type":"ContainerStarted","Data":"b241c4221601df9697b0d527a1704ca21776a3444128d5bf0ffdb76507d52546"} Jan 25 05:43:56 crc kubenswrapper[4728]: I0125 05:43:56.025557 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:56 crc kubenswrapper[4728]: I0125 05:43:56.040576 4728 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" Jan 25 05:43:56 crc kubenswrapper[4728]: I0125 05:43:56.044129 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-d46df4d5f-8p7kj" podStartSLOduration=27.044119621 podStartE2EDuration="27.044119621s" podCreationTimestamp="2026-01-25 05:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:43:56.042905252 +0000 UTC m=+327.078783233" watchObservedRunningTime="2026-01-25 05:43:56.044119621 +0000 UTC m=+327.079997601" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.231456 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b"] Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.232897 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" podUID="a714c74d-ff6c-47da-bb4e-1851115ba3ea" containerName="route-controller-manager" containerID="cri-o://e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc" gracePeriod=30 Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.631957 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.819128 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-config\") pod \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.819199 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-client-ca\") pod \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.819281 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a714c74d-ff6c-47da-bb4e-1851115ba3ea-serving-cert\") pod \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.819349 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v54b5\" (UniqueName: \"kubernetes.io/projected/a714c74d-ff6c-47da-bb4e-1851115ba3ea-kube-api-access-v54b5\") pod \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\" (UID: \"a714c74d-ff6c-47da-bb4e-1851115ba3ea\") " Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.821314 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-client-ca" (OuterVolumeSpecName: "client-ca") pod "a714c74d-ff6c-47da-bb4e-1851115ba3ea" (UID: "a714c74d-ff6c-47da-bb4e-1851115ba3ea"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.821357 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-config" (OuterVolumeSpecName: "config") pod "a714c74d-ff6c-47da-bb4e-1851115ba3ea" (UID: "a714c74d-ff6c-47da-bb4e-1851115ba3ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.826216 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a714c74d-ff6c-47da-bb4e-1851115ba3ea-kube-api-access-v54b5" (OuterVolumeSpecName: "kube-api-access-v54b5") pod "a714c74d-ff6c-47da-bb4e-1851115ba3ea" (UID: "a714c74d-ff6c-47da-bb4e-1851115ba3ea"). InnerVolumeSpecName "kube-api-access-v54b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.826502 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a714c74d-ff6c-47da-bb4e-1851115ba3ea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a714c74d-ff6c-47da-bb4e-1851115ba3ea" (UID: "a714c74d-ff6c-47da-bb4e-1851115ba3ea"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.920571 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-client-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.920600 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a714c74d-ff6c-47da-bb4e-1851115ba3ea-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.920615 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v54b5\" (UniqueName: \"kubernetes.io/projected/a714c74d-ff6c-47da-bb4e-1851115ba3ea-kube-api-access-v54b5\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:00 crc kubenswrapper[4728]: I0125 05:44:00.920626 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a714c74d-ff6c-47da-bb4e-1851115ba3ea-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.058370 4728 generic.go:334] "Generic (PLEG): container finished" podID="a714c74d-ff6c-47da-bb4e-1851115ba3ea" containerID="e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc" exitCode=0 Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.058432 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" event={"ID":"a714c74d-ff6c-47da-bb4e-1851115ba3ea","Type":"ContainerDied","Data":"e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc"} Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.058472 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" 
event={"ID":"a714c74d-ff6c-47da-bb4e-1851115ba3ea","Type":"ContainerDied","Data":"518987e0d7d82d21ebf34bee87555b2ff571ec5ac2ad8f74eab8ccb37529591d"} Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.058495 4728 scope.go:117] "RemoveContainer" containerID="e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.058647 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.088873 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b"] Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.089315 4728 scope.go:117] "RemoveContainer" containerID="e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc" Jan 25 05:44:01 crc kubenswrapper[4728]: E0125 05:44:01.089802 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc\": container with ID starting with e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc not found: ID does not exist" containerID="e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.089843 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc"} err="failed to get container status \"e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc\": rpc error: code = NotFound desc = could not find container \"e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc\": container with ID starting with e811398862b2ddf4866e94142fcd28efee09a0b127c6faab4ef85277aed1cddc not found: ID does not exist" Jan 
25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.095506 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588756b8c7-w6r6b"] Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.337841 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a714c74d-ff6c-47da-bb4e-1851115ba3ea" path="/var/lib/kubelet/pods/a714c74d-ff6c-47da-bb4e-1851115ba3ea/volumes" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.739956 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872"] Jan 25 05:44:01 crc kubenswrapper[4728]: E0125 05:44:01.740315 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a714c74d-ff6c-47da-bb4e-1851115ba3ea" containerName="route-controller-manager" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.740359 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a714c74d-ff6c-47da-bb4e-1851115ba3ea" containerName="route-controller-manager" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.740520 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a714c74d-ff6c-47da-bb4e-1851115ba3ea" containerName="route-controller-manager" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.748115 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872"] Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.748222 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.752181 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.752474 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.752855 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.753131 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.753347 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.753550 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.835628 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22effd18-e758-451d-8d99-fe7c1dda2bf6-client-ca\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.835761 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7x85\" (UniqueName: \"kubernetes.io/projected/22effd18-e758-451d-8d99-fe7c1dda2bf6-kube-api-access-b7x85\") pod 
\"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.835868 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22effd18-e758-451d-8d99-fe7c1dda2bf6-serving-cert\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.835972 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22effd18-e758-451d-8d99-fe7c1dda2bf6-config\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.936556 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22effd18-e758-451d-8d99-fe7c1dda2bf6-client-ca\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.936606 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7x85\" (UniqueName: \"kubernetes.io/projected/22effd18-e758-451d-8d99-fe7c1dda2bf6-kube-api-access-b7x85\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 
05:44:01.936638 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22effd18-e758-451d-8d99-fe7c1dda2bf6-serving-cert\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.936675 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22effd18-e758-451d-8d99-fe7c1dda2bf6-config\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.937687 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22effd18-e758-451d-8d99-fe7c1dda2bf6-client-ca\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.939235 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22effd18-e758-451d-8d99-fe7c1dda2bf6-config\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.941028 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22effd18-e758-451d-8d99-fe7c1dda2bf6-serving-cert\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " 
pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:01 crc kubenswrapper[4728]: I0125 05:44:01.952720 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7x85\" (UniqueName: \"kubernetes.io/projected/22effd18-e758-451d-8d99-fe7c1dda2bf6-kube-api-access-b7x85\") pod \"route-controller-manager-fd44dd559-fs872\" (UID: \"22effd18-e758-451d-8d99-fe7c1dda2bf6\") " pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:02 crc kubenswrapper[4728]: I0125 05:44:02.064081 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:02 crc kubenswrapper[4728]: I0125 05:44:02.446112 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872"] Jan 25 05:44:03 crc kubenswrapper[4728]: I0125 05:44:03.084168 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" event={"ID":"22effd18-e758-451d-8d99-fe7c1dda2bf6","Type":"ContainerStarted","Data":"3db3700c679bf23d27ae038ba2dde7992e582357bbae0d09b34a2af5ac7964f6"} Jan 25 05:44:03 crc kubenswrapper[4728]: I0125 05:44:03.084365 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" event={"ID":"22effd18-e758-451d-8d99-fe7c1dda2bf6","Type":"ContainerStarted","Data":"a9afd84bf870ca8a4f01e1927eca2424b0c32611df556dc25127f21a9e49b01a"} Jan 25 05:44:03 crc kubenswrapper[4728]: I0125 05:44:03.084765 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:03 crc kubenswrapper[4728]: I0125 05:44:03.092839 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" Jan 25 05:44:03 crc kubenswrapper[4728]: I0125 05:44:03.100437 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fd44dd559-fs872" podStartSLOduration=3.100422911 podStartE2EDuration="3.100422911s" podCreationTimestamp="2026-01-25 05:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:44:03.099379244 +0000 UTC m=+334.135257214" watchObservedRunningTime="2026-01-25 05:44:03.100422911 +0000 UTC m=+334.136300891" Jan 25 05:44:11 crc kubenswrapper[4728]: I0125 05:44:11.553503 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zhjcp" Jan 25 05:44:11 crc kubenswrapper[4728]: I0125 05:44:11.610211 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22q6g"] Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.641457 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" podUID="fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" containerName="registry" containerID="cri-o://11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5" gracePeriod=30 Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.955300 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.970639 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-certificates\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.970826 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-ca-trust-extracted\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.970982 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-trusted-ca\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.971018 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-bound-sa-token\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.971308 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.971377 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-tls\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.971453 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.971509 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-installation-pull-secrets\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.971547 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvwk6\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-kube-api-access-gvwk6\") pod \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\" (UID: \"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1\") " Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.972015 4728 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.973097 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod 
"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.978732 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.978926 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.982881 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-kube-api-access-gvwk6" (OuterVolumeSpecName: "kube-api-access-gvwk6") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "kube-api-access-gvwk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.984851 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.986067 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 25 05:44:36 crc kubenswrapper[4728]: I0125 05:44:36.990273 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" (UID: "fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.072730 4728 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.072766 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.072776 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.072788 4728 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.072798 4728 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.072814 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvwk6\" (UniqueName: \"kubernetes.io/projected/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1-kube-api-access-gvwk6\") on node \"crc\" DevicePath \"\"" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.268241 4728 generic.go:334] "Generic (PLEG): container finished" podID="fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" containerID="11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5" exitCode=0 Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.268294 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" event={"ID":"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1","Type":"ContainerDied","Data":"11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5"} Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.268355 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.268371 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-22q6g" event={"ID":"fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1","Type":"ContainerDied","Data":"3840de3c61125411d59ed4c25c5e9f57cd168bb37a3765266e68851002de2788"} Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.268403 4728 scope.go:117] "RemoveContainer" containerID="11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.283269 4728 scope.go:117] "RemoveContainer" containerID="11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5" Jan 25 05:44:37 crc kubenswrapper[4728]: E0125 05:44:37.284188 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5\": container with ID starting with 11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5 not found: ID does not exist" containerID="11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.284225 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5"} err="failed to get container status \"11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5\": rpc error: code = NotFound desc = could not find container \"11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5\": container with ID starting with 11a5effecebc894c5bac0f1104358f2c716aaa8c880df7e6e20099a30923cfe5 not found: ID does not exist" Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.293499 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-22q6g"] Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.296034 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-22q6g"] Jan 25 05:44:37 crc kubenswrapper[4728]: I0125 05:44:37.335098 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" path="/var/lib/kubelet/pods/fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1/volumes" Jan 25 05:44:42 crc kubenswrapper[4728]: I0125 05:44:42.899405 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:44:42 crc kubenswrapper[4728]: I0125 05:44:42.899818 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.155455 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz"] Jan 25 05:45:00 crc kubenswrapper[4728]: E0125 05:45:00.156227 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" containerName="registry" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.156245 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" containerName="registry" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.156367 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5ba2c4-eaeb-4620-b71b-7bb45ad640d1" 
containerName="registry" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.156808 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.158240 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.159341 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.161515 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz"] Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.212951 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxq79\" (UniqueName: \"kubernetes.io/projected/022c3734-c432-4ba6-9e9b-19fcedd5db9c-kube-api-access-zxq79\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.213003 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/022c3734-c432-4ba6-9e9b-19fcedd5db9c-secret-volume\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.213073 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022c3734-c432-4ba6-9e9b-19fcedd5db9c-config-volume\") 
pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.314199 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022c3734-c432-4ba6-9e9b-19fcedd5db9c-config-volume\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.314381 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxq79\" (UniqueName: \"kubernetes.io/projected/022c3734-c432-4ba6-9e9b-19fcedd5db9c-kube-api-access-zxq79\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.314429 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/022c3734-c432-4ba6-9e9b-19fcedd5db9c-secret-volume\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.315121 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022c3734-c432-4ba6-9e9b-19fcedd5db9c-config-volume\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.319734 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/022c3734-c432-4ba6-9e9b-19fcedd5db9c-secret-volume\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.329715 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxq79\" (UniqueName: \"kubernetes.io/projected/022c3734-c432-4ba6-9e9b-19fcedd5db9c-kube-api-access-zxq79\") pod \"collect-profiles-29488665-blqfz\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.469611 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:00 crc kubenswrapper[4728]: I0125 05:45:00.821219 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz"] Jan 25 05:45:01 crc kubenswrapper[4728]: I0125 05:45:01.418314 4728 generic.go:334] "Generic (PLEG): container finished" podID="022c3734-c432-4ba6-9e9b-19fcedd5db9c" containerID="bd855b185c4d35baade53a1731370fdcca7a90d2b6960f6fe567f6dcc901311d" exitCode=0 Jan 25 05:45:01 crc kubenswrapper[4728]: I0125 05:45:01.418371 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" event={"ID":"022c3734-c432-4ba6-9e9b-19fcedd5db9c","Type":"ContainerDied","Data":"bd855b185c4d35baade53a1731370fdcca7a90d2b6960f6fe567f6dcc901311d"} Jan 25 05:45:01 crc kubenswrapper[4728]: I0125 05:45:01.418394 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" 
event={"ID":"022c3734-c432-4ba6-9e9b-19fcedd5db9c","Type":"ContainerStarted","Data":"264c817bc64a1ce2274528b8e9831679ac1a80330b8283c64c1f404dc1d6e39e"} Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.625083 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.639684 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022c3734-c432-4ba6-9e9b-19fcedd5db9c-config-volume\") pod \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.639725 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/022c3734-c432-4ba6-9e9b-19fcedd5db9c-secret-volume\") pod \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.639771 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxq79\" (UniqueName: \"kubernetes.io/projected/022c3734-c432-4ba6-9e9b-19fcedd5db9c-kube-api-access-zxq79\") pod \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\" (UID: \"022c3734-c432-4ba6-9e9b-19fcedd5db9c\") " Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.640624 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/022c3734-c432-4ba6-9e9b-19fcedd5db9c-config-volume" (OuterVolumeSpecName: "config-volume") pod "022c3734-c432-4ba6-9e9b-19fcedd5db9c" (UID: "022c3734-c432-4ba6-9e9b-19fcedd5db9c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.645982 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022c3734-c432-4ba6-9e9b-19fcedd5db9c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "022c3734-c432-4ba6-9e9b-19fcedd5db9c" (UID: "022c3734-c432-4ba6-9e9b-19fcedd5db9c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.646332 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/022c3734-c432-4ba6-9e9b-19fcedd5db9c-kube-api-access-zxq79" (OuterVolumeSpecName: "kube-api-access-zxq79") pod "022c3734-c432-4ba6-9e9b-19fcedd5db9c" (UID: "022c3734-c432-4ba6-9e9b-19fcedd5db9c"). InnerVolumeSpecName "kube-api-access-zxq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.741290 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022c3734-c432-4ba6-9e9b-19fcedd5db9c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.741338 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/022c3734-c432-4ba6-9e9b-19fcedd5db9c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 25 05:45:02 crc kubenswrapper[4728]: I0125 05:45:02.741361 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxq79\" (UniqueName: \"kubernetes.io/projected/022c3734-c432-4ba6-9e9b-19fcedd5db9c-kube-api-access-zxq79\") on node \"crc\" DevicePath \"\"" Jan 25 05:45:03 crc kubenswrapper[4728]: I0125 05:45:03.429871 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" 
event={"ID":"022c3734-c432-4ba6-9e9b-19fcedd5db9c","Type":"ContainerDied","Data":"264c817bc64a1ce2274528b8e9831679ac1a80330b8283c64c1f404dc1d6e39e"} Jan 25 05:45:03 crc kubenswrapper[4728]: I0125 05:45:03.430242 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="264c817bc64a1ce2274528b8e9831679ac1a80330b8283c64c1f404dc1d6e39e" Jan 25 05:45:03 crc kubenswrapper[4728]: I0125 05:45:03.429932 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz" Jan 25 05:45:12 crc kubenswrapper[4728]: I0125 05:45:12.898958 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:45:12 crc kubenswrapper[4728]: I0125 05:45:12.899261 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:45:42 crc kubenswrapper[4728]: I0125 05:45:42.899558 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:45:42 crc kubenswrapper[4728]: I0125 05:45:42.900895 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:45:42 crc kubenswrapper[4728]: I0125 05:45:42.900967 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:45:42 crc kubenswrapper[4728]: I0125 05:45:42.901562 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c6bd49d9b17f994e00e405d6b8f16b6edd37de171ad4d27462fcbdcfc065a69"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 05:45:42 crc kubenswrapper[4728]: I0125 05:45:42.901633 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://9c6bd49d9b17f994e00e405d6b8f16b6edd37de171ad4d27462fcbdcfc065a69" gracePeriod=600 Jan 25 05:45:43 crc kubenswrapper[4728]: I0125 05:45:43.632506 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="9c6bd49d9b17f994e00e405d6b8f16b6edd37de171ad4d27462fcbdcfc065a69" exitCode=0 Jan 25 05:45:43 crc kubenswrapper[4728]: I0125 05:45:43.632588 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"9c6bd49d9b17f994e00e405d6b8f16b6edd37de171ad4d27462fcbdcfc065a69"} Jan 25 05:45:43 crc kubenswrapper[4728]: I0125 05:45:43.633116 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" 
event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"4869a4031b431cc23a01935621e0bf0cd63107d4d6edf2fef74234f9435dad57"} Jan 25 05:45:43 crc kubenswrapper[4728]: I0125 05:45:43.633156 4728 scope.go:117] "RemoveContainer" containerID="0deee2df103ba589fbc0e9893b27c50920332fe2eee95f70ab89ed780903ac24" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.844044 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-98wvg"] Jan 25 05:47:27 crc kubenswrapper[4728]: E0125 05:47:27.844689 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="022c3734-c432-4ba6-9e9b-19fcedd5db9c" containerName="collect-profiles" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.844702 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="022c3734-c432-4ba6-9e9b-19fcedd5db9c" containerName="collect-profiles" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.844792 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="022c3734-c432-4ba6-9e9b-19fcedd5db9c" containerName="collect-profiles" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.845147 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.848660 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.848666 4728 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zmlkl" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.851810 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.859999 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-c5nfr"] Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.860464 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c5nfr" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.864034 4728 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-5cjgw" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.867408 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-98wvg"] Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.874228 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-mm5zm"] Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.874678 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.875982 4728 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l2n82" Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.876716 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c5nfr"] Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.911566 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-mm5zm"] Jan 25 05:47:27 crc kubenswrapper[4728]: I0125 05:47:27.955441 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffqn\" (UniqueName: \"kubernetes.io/projected/c89a03e4-cc67-408c-93f8-7c0972ac36a8-kube-api-access-fffqn\") pod \"cert-manager-cainjector-cf98fcc89-98wvg\" (UID: \"c89a03e4-cc67-408c-93f8-7c0972ac36a8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.057481 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffqn\" (UniqueName: \"kubernetes.io/projected/c89a03e4-cc67-408c-93f8-7c0972ac36a8-kube-api-access-fffqn\") pod \"cert-manager-cainjector-cf98fcc89-98wvg\" (UID: \"c89a03e4-cc67-408c-93f8-7c0972ac36a8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.057630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75j9v\" (UniqueName: \"kubernetes.io/projected/8060ef0d-4977-4b40-a26c-bded7ccbe72e-kube-api-access-75j9v\") pod \"cert-manager-858654f9db-c5nfr\" (UID: \"8060ef0d-4977-4b40-a26c-bded7ccbe72e\") " pod="cert-manager/cert-manager-858654f9db-c5nfr" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.057664 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfr9\" (UniqueName: \"kubernetes.io/projected/e7eaed33-a3a8-45fd-b1be-9bec59f65967-kube-api-access-wlfr9\") pod \"cert-manager-webhook-687f57d79b-mm5zm\" (UID: \"e7eaed33-a3a8-45fd-b1be-9bec59f65967\") " pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.073426 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffqn\" (UniqueName: \"kubernetes.io/projected/c89a03e4-cc67-408c-93f8-7c0972ac36a8-kube-api-access-fffqn\") pod \"cert-manager-cainjector-cf98fcc89-98wvg\" (UID: \"c89a03e4-cc67-408c-93f8-7c0972ac36a8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.157218 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.158645 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75j9v\" (UniqueName: \"kubernetes.io/projected/8060ef0d-4977-4b40-a26c-bded7ccbe72e-kube-api-access-75j9v\") pod \"cert-manager-858654f9db-c5nfr\" (UID: \"8060ef0d-4977-4b40-a26c-bded7ccbe72e\") " pod="cert-manager/cert-manager-858654f9db-c5nfr" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.158713 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfr9\" (UniqueName: \"kubernetes.io/projected/e7eaed33-a3a8-45fd-b1be-9bec59f65967-kube-api-access-wlfr9\") pod \"cert-manager-webhook-687f57d79b-mm5zm\" (UID: \"e7eaed33-a3a8-45fd-b1be-9bec59f65967\") " pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.173666 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfr9\" (UniqueName: 
\"kubernetes.io/projected/e7eaed33-a3a8-45fd-b1be-9bec59f65967-kube-api-access-wlfr9\") pod \"cert-manager-webhook-687f57d79b-mm5zm\" (UID: \"e7eaed33-a3a8-45fd-b1be-9bec59f65967\") " pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.174355 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75j9v\" (UniqueName: \"kubernetes.io/projected/8060ef0d-4977-4b40-a26c-bded7ccbe72e-kube-api-access-75j9v\") pod \"cert-manager-858654f9db-c5nfr\" (UID: \"8060ef0d-4977-4b40-a26c-bded7ccbe72e\") " pod="cert-manager/cert-manager-858654f9db-c5nfr" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.185750 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.470333 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c5nfr" Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.513569 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-98wvg"] Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.523900 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.562286 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-mm5zm"] Jan 25 05:47:28 crc kubenswrapper[4728]: W0125 05:47:28.573797 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7eaed33_a3a8_45fd_b1be_9bec59f65967.slice/crio-0c993bfa8425ba0a9601316d2bbeb142e3477ea1cb5dd416aebf0b51bfbd8dad WatchSource:0}: Error finding container 0c993bfa8425ba0a9601316d2bbeb142e3477ea1cb5dd416aebf0b51bfbd8dad: Status 404 returned error can't find 
the container with id 0c993bfa8425ba0a9601316d2bbeb142e3477ea1cb5dd416aebf0b51bfbd8dad Jan 25 05:47:28 crc kubenswrapper[4728]: I0125 05:47:28.632088 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c5nfr"] Jan 25 05:47:28 crc kubenswrapper[4728]: W0125 05:47:28.636421 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8060ef0d_4977_4b40_a26c_bded7ccbe72e.slice/crio-5adc84882aa7b3df5f96c9cce65cc121e7e450a8cf0b2cb831fc69bcdf0f8546 WatchSource:0}: Error finding container 5adc84882aa7b3df5f96c9cce65cc121e7e450a8cf0b2cb831fc69bcdf0f8546: Status 404 returned error can't find the container with id 5adc84882aa7b3df5f96c9cce65cc121e7e450a8cf0b2cb831fc69bcdf0f8546 Jan 25 05:47:29 crc kubenswrapper[4728]: I0125 05:47:29.176949 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c5nfr" event={"ID":"8060ef0d-4977-4b40-a26c-bded7ccbe72e","Type":"ContainerStarted","Data":"5adc84882aa7b3df5f96c9cce65cc121e7e450a8cf0b2cb831fc69bcdf0f8546"} Jan 25 05:47:29 crc kubenswrapper[4728]: I0125 05:47:29.178440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" event={"ID":"e7eaed33-a3a8-45fd-b1be-9bec59f65967","Type":"ContainerStarted","Data":"0c993bfa8425ba0a9601316d2bbeb142e3477ea1cb5dd416aebf0b51bfbd8dad"} Jan 25 05:47:29 crc kubenswrapper[4728]: I0125 05:47:29.179736 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" event={"ID":"c89a03e4-cc67-408c-93f8-7c0972ac36a8","Type":"ContainerStarted","Data":"2fe256cb7320d02f5d55851ddd91a66b3ed4482414ed8aa98a8d092229af3c43"} Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.197162 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" 
event={"ID":"c89a03e4-cc67-408c-93f8-7c0972ac36a8","Type":"ContainerStarted","Data":"7661823d4a4e4683f7cbd18a3d786d7b0f4dac492f5839e5c30f4a478335e140"} Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.198622 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c5nfr" event={"ID":"8060ef0d-4977-4b40-a26c-bded7ccbe72e","Type":"ContainerStarted","Data":"3dc5161e9171d93738d90611ec5ef3e4040844f3ccbec3d1fbf37608c3b512bd"} Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.201236 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" event={"ID":"e7eaed33-a3a8-45fd-b1be-9bec59f65967","Type":"ContainerStarted","Data":"9156de0512852974a9130b8a29ee57b7862c35496ba9cce1fc6ad2f00ea43bca"} Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.201689 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.215560 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-98wvg" podStartSLOduration=2.509424342 podStartE2EDuration="5.215536732s" podCreationTimestamp="2026-01-25 05:47:27 +0000 UTC" firstStartedPulling="2026-01-25 05:47:28.523679275 +0000 UTC m=+539.559557256" lastFinishedPulling="2026-01-25 05:47:31.229791667 +0000 UTC m=+542.265669646" observedRunningTime="2026-01-25 05:47:32.210441344 +0000 UTC m=+543.246319324" watchObservedRunningTime="2026-01-25 05:47:32.215536732 +0000 UTC m=+543.251414712" Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.227184 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" podStartSLOduration=2.5649803110000002 podStartE2EDuration="5.227172385s" podCreationTimestamp="2026-01-25 05:47:27 +0000 UTC" firstStartedPulling="2026-01-25 05:47:28.576613602 +0000 
UTC m=+539.612491581" lastFinishedPulling="2026-01-25 05:47:31.238805685 +0000 UTC m=+542.274683655" observedRunningTime="2026-01-25 05:47:32.224442988 +0000 UTC m=+543.260320968" watchObservedRunningTime="2026-01-25 05:47:32.227172385 +0000 UTC m=+543.263050366" Jan 25 05:47:32 crc kubenswrapper[4728]: I0125 05:47:32.240116 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-c5nfr" podStartSLOduration=2.642243448 podStartE2EDuration="5.240088873s" podCreationTimestamp="2026-01-25 05:47:27 +0000 UTC" firstStartedPulling="2026-01-25 05:47:28.638424137 +0000 UTC m=+539.674302117" lastFinishedPulling="2026-01-25 05:47:31.236269563 +0000 UTC m=+542.272147542" observedRunningTime="2026-01-25 05:47:32.237469804 +0000 UTC m=+543.273347784" watchObservedRunningTime="2026-01-25 05:47:32.240088873 +0000 UTC m=+543.275966843" Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.189280 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-mm5zm" Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724089 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zmqrx"] Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724474 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-controller" containerID="cri-o://84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724580 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" gracePeriod=30 
Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724611 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="northd" containerID="cri-o://07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724733 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-node" containerID="cri-o://e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724776 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="sbdb" containerID="cri-o://ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724824 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="nbdb" containerID="cri-o://68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.724835 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-acl-logging" containerID="cri-o://d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.746014 4728 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" containerID="cri-o://b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" gracePeriod=30 Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.988881 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/3.log" Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.994361 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovn-acl-logging/0.log" Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.994887 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovn-controller/0.log" Jan 25 05:47:38 crc kubenswrapper[4728]: I0125 05:47:38.995599 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.044668 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nz9fw"] Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.044983 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="northd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045005 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="northd" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045019 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045027 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045034 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045041 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045050 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045056 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045064 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045070 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045082 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kubecfg-setup" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045091 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kubecfg-setup" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045106 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="nbdb" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045113 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="nbdb" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045121 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="sbdb" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045127 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="sbdb" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045134 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-acl-logging" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045140 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-acl-logging" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045148 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" 
containerName="kube-rbac-proxy-ovn-metrics" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045155 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-ovn-metrics" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045164 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-node" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045171 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-node" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045260 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="northd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045269 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045277 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045285 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045293 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="nbdb" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045301 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="sbdb" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045328 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045337 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045347 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-ovn-metrics" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045355 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="kube-rbac-proxy-node" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045363 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovn-acl-logging" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045457 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045465 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.045473 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045479 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.045577 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerName="ovnkube-controller" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.047098 4728 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175419 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-systemd-units\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175474 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-slash\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175517 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-systemd\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175533 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175535 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-log-socket\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175604 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-log-socket" (OuterVolumeSpecName: "log-socket") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175645 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-slash" (OuterVolumeSpecName: "host-slash") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175670 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-ovn\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175695 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175737 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw4hx\" (UniqueName: \"kubernetes.io/projected/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-kube-api-access-dw4hx\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175765 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-kubelet\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175797 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175799 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-env-overrides\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175867 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-bin\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175887 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-netns\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175910 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-node-log\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175939 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-etc-openvswitch\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175957 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175964 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175999 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-node-log" (OuterVolumeSpecName: "node-log") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.175999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-script-lib\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176013 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176003 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176059 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176037 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-netd\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176090 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176130 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-var-lib-openvswitch\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176181 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovn-node-metrics-cert\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176208 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-openvswitch\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176230 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-ovn-kubernetes\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176264 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-config\") pod \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\" (UID: \"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b\") " Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176275 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176276 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176312 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176494 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovnkube-config\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176533 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-cni-bin\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176558 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-node-log\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176589 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjdn\" (UniqueName: \"kubernetes.io/projected/f7d0a9c3-84ef-4ea0-97c6-68629846367f-kube-api-access-8jjdn\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-run-netns\") pod \"ovnkube-node-nz9fw\" (UID: 
\"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176666 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-etc-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176718 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-log-socket\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176739 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176774 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovnkube-script-lib\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.176996 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-config" (OuterVolumeSpecName: 
"ovnkube-config") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177013 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-ovn\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177115 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovn-node-metrics-cert\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177149 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-var-lib-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177171 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177190 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-kubelet\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177213 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-cni-netd\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177232 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177251 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-systemd-units\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177276 
4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-slash\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177292 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-systemd\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177334 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177454 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-env-overrides\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177529 4728 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177554 4728 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177565 4728 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-node-log\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177578 4728 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177588 4728 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177602 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177613 4728 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177622 4728 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177631 4728 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177642 4728 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177660 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177673 4728 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177681 4728 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-slash\") on node \"crc\" 
DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177689 4728 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-log-socket\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177701 4728 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177712 4728 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.177723 4728 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.183532 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.183537 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-kube-api-access-dw4hx" (OuterVolumeSpecName: "kube-api-access-dw4hx") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "kube-api-access-dw4hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.190765 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" (UID: "5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.238707 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/2.log" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.239147 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/1.log" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.239197 4728 generic.go:334] "Generic (PLEG): container finished" podID="c2ffc038-3d70-4d2c-b150-e8529f622238" containerID="02594a5778eadcc12813b7374fa9212bd49759f86eda609cec8b87645a3c371e" exitCode=2 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.239267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerDied","Data":"02594a5778eadcc12813b7374fa9212bd49759f86eda609cec8b87645a3c371e"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.239342 4728 scope.go:117] "RemoveContainer" containerID="18ccc74364f8ced6d11a49c6a938ad9d631766948fbe9bc01fee40e3bdbdc991" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.240508 4728 scope.go:117] "RemoveContainer" containerID="02594a5778eadcc12813b7374fa9212bd49759f86eda609cec8b87645a3c371e" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.241630 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kdxw7_openshift-multus(c2ffc038-3d70-4d2c-b150-e8529f622238)\"" pod="openshift-multus/multus-kdxw7" podUID="c2ffc038-3d70-4d2c-b150-e8529f622238" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.241761 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovnkube-controller/3.log" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.244136 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovn-acl-logging/0.log" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.244617 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmqrx_5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/ovn-controller/0.log" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245430 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" exitCode=0 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245463 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" exitCode=0 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245476 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" exitCode=0 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245487 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" exitCode=0 Jan 25 05:47:39 crc 
kubenswrapper[4728]: I0125 05:47:39.245495 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" exitCode=0 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245504 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" exitCode=0 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245514 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" exitCode=143 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245523 4728 generic.go:334] "Generic (PLEG): container finished" podID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" exitCode=143 Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245550 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245588 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245604 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} Jan 25 05:47:39 crc kubenswrapper[4728]: 
I0125 05:47:39.245620 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245634 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245646 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245673 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245685 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245693 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245702 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245711 4728 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245719 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245726 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245733 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245738 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245745 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245752 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245761 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} Jan 25 
05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245767 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245773 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245777 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245783 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245789 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245794 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245799 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245804 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} Jan 25 
05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245809 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245816 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245823 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245829 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245834 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245840 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245845 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245850 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245856 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245862 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245867 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245873 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245879 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" event={"ID":"5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b","Type":"ContainerDied","Data":"9db384fe5a3596a2ad0fd792fae12d6dadd0d16e2a7c5f023504eacf4a041f33"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245887 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245893 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245898 4728 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245903 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245908 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245913 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245918 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245923 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245928 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245933 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.245692 4728 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zmqrx" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278581 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-ovn\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278625 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovn-node-metrics-cert\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278663 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-var-lib-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278691 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278711 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-kubelet\") pod \"ovnkube-node-nz9fw\" (UID: 
\"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278729 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-cni-netd\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278750 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278768 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-systemd-units\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278786 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-slash\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278802 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-systemd\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" 
Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278866 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-env-overrides\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278883 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovnkube-config\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278908 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-cni-bin\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278929 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-node-log\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278960 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjdn\" (UniqueName: \"kubernetes.io/projected/f7d0a9c3-84ef-4ea0-97c6-68629846367f-kube-api-access-8jjdn\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278976 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-run-netns\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.278997 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-etc-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.279017 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-log-socket\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.279035 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.279067 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovnkube-script-lib\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.279109 4728 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-dw4hx\" (UniqueName: \"kubernetes.io/projected/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-kube-api-access-dw4hx\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.279315 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.279348 4728 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.280507 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovnkube-script-lib\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.280566 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-ovn\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.281233 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.281274 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-var-lib-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.281300 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.281341 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-kubelet\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.281363 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-cni-netd\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.281789 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-env-overrides\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.282409 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-systemd-units\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.282443 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-slash\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.282466 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-run-systemd\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.282739 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-cni-bin\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.282769 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-etc-openvswitch\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.283256 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovn-node-metrics-cert\") pod \"ovnkube-node-nz9fw\" (UID: 
\"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.283299 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-log-socket\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.283338 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.283363 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-host-run-netns\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.283392 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7d0a9c3-84ef-4ea0-97c6-68629846367f-node-log\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.283840 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7d0a9c3-84ef-4ea0-97c6-68629846367f-ovnkube-config\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.285902 4728 scope.go:117] "RemoveContainer" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.301275 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zmqrx"] Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.303822 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.304488 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zmqrx"] Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.310279 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjdn\" (UniqueName: \"kubernetes.io/projected/f7d0a9c3-84ef-4ea0-97c6-68629846367f-kube-api-access-8jjdn\") pod \"ovnkube-node-nz9fw\" (UID: \"f7d0a9c3-84ef-4ea0-97c6-68629846367f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.338701 4728 scope.go:117] "RemoveContainer" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.347730 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b" path="/var/lib/kubelet/pods/5eb5cbcd-e874-4d07-a231-0eb38ef5fc5b/volumes" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.355191 4728 scope.go:117] "RemoveContainer" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.360783 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.365093 4728 scope.go:117] "RemoveContainer" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.377065 4728 scope.go:117] "RemoveContainer" containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.389398 4728 scope.go:117] "RemoveContainer" containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.406037 4728 scope.go:117] "RemoveContainer" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.418159 4728 scope.go:117] "RemoveContainer" containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.443727 4728 scope.go:117] "RemoveContainer" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.473278 4728 scope.go:117] "RemoveContainer" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.473744 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": container with ID starting with b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a not found: ID does not exist" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.473796 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} err="failed to get container status \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": rpc error: code = NotFound desc = could not find container \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": container with ID starting with b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.473828 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.474281 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": container with ID starting with 5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938 not found: ID does not exist" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.474309 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} err="failed to get container status \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": rpc error: code = NotFound desc = could not find container \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": container with ID starting with 5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.474350 4728 scope.go:117] "RemoveContainer" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.475390 4728 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": container with ID starting with ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8 not found: ID does not exist" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.475421 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} err="failed to get container status \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": rpc error: code = NotFound desc = could not find container \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": container with ID starting with ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.475437 4728 scope.go:117] "RemoveContainer" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.475802 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": container with ID starting with 68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10 not found: ID does not exist" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.475822 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} err="failed to get container status \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": rpc error: code = NotFound desc = could not find container 
\"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": container with ID starting with 68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.475835 4728 scope.go:117] "RemoveContainer" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.476092 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": container with ID starting with 07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697 not found: ID does not exist" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476117 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} err="failed to get container status \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": rpc error: code = NotFound desc = could not find container \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": container with ID starting with 07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476132 4728 scope.go:117] "RemoveContainer" containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.476405 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": container with ID starting with a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641 not found: ID does not exist" 
containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476421 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} err="failed to get container status \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": rpc error: code = NotFound desc = could not find container \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": container with ID starting with a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476437 4728 scope.go:117] "RemoveContainer" containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.476635 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": container with ID starting with e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426 not found: ID does not exist" containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476662 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} err="failed to get container status \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": rpc error: code = NotFound desc = could not find container \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": container with ID starting with e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476678 4728 scope.go:117] 
"RemoveContainer" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.476870 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": container with ID starting with d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd not found: ID does not exist" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476888 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} err="failed to get container status \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": rpc error: code = NotFound desc = could not find container \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": container with ID starting with d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.476902 4728 scope.go:117] "RemoveContainer" containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.477076 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": container with ID starting with 84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325 not found: ID does not exist" containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.477091 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} err="failed to get container status \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": rpc error: code = NotFound desc = could not find container \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": container with ID starting with 84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.477103 4728 scope.go:117] "RemoveContainer" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" Jan 25 05:47:39 crc kubenswrapper[4728]: E0125 05:47:39.477279 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": container with ID starting with ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f not found: ID does not exist" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.477295 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} err="failed to get container status \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": rpc error: code = NotFound desc = could not find container \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": container with ID starting with ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.477309 4728 scope.go:117] "RemoveContainer" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.478044 4728 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} err="failed to get container status \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": rpc error: code = NotFound desc = could not find container \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": container with ID starting with b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.478091 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.478456 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} err="failed to get container status \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": rpc error: code = NotFound desc = could not find container \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": container with ID starting with 5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.478503 4728 scope.go:117] "RemoveContainer" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.478892 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} err="failed to get container status \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": rpc error: code = NotFound desc = could not find container \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": container with ID starting with ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8 not 
found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.478916 4728 scope.go:117] "RemoveContainer" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.479646 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} err="failed to get container status \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": rpc error: code = NotFound desc = could not find container \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": container with ID starting with 68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.479683 4728 scope.go:117] "RemoveContainer" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.480350 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} err="failed to get container status \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": rpc error: code = NotFound desc = could not find container \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": container with ID starting with 07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.480382 4728 scope.go:117] "RemoveContainer" containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.481267 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} err="failed to get 
container status \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": rpc error: code = NotFound desc = could not find container \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": container with ID starting with a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.481294 4728 scope.go:117] "RemoveContainer" containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.481693 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} err="failed to get container status \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": rpc error: code = NotFound desc = could not find container \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": container with ID starting with e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.481757 4728 scope.go:117] "RemoveContainer" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.482213 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} err="failed to get container status \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": rpc error: code = NotFound desc = could not find container \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": container with ID starting with d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.482272 4728 scope.go:117] "RemoveContainer" 
containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.482570 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} err="failed to get container status \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": rpc error: code = NotFound desc = could not find container \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": container with ID starting with 84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.482602 4728 scope.go:117] "RemoveContainer" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.482905 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} err="failed to get container status \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": rpc error: code = NotFound desc = could not find container \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": container with ID starting with ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.482985 4728 scope.go:117] "RemoveContainer" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.483365 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} err="failed to get container status \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": rpc error: code = NotFound desc = could 
not find container \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": container with ID starting with b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.483392 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.483673 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} err="failed to get container status \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": rpc error: code = NotFound desc = could not find container \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": container with ID starting with 5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.483755 4728 scope.go:117] "RemoveContainer" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.484155 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} err="failed to get container status \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": rpc error: code = NotFound desc = could not find container \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": container with ID starting with ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.484189 4728 scope.go:117] "RemoveContainer" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 
05:47:39.484491 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} err="failed to get container status \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": rpc error: code = NotFound desc = could not find container \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": container with ID starting with 68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.484584 4728 scope.go:117] "RemoveContainer" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.484907 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} err="failed to get container status \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": rpc error: code = NotFound desc = could not find container \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": container with ID starting with 07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.484982 4728 scope.go:117] "RemoveContainer" containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.485269 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} err="failed to get container status \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": rpc error: code = NotFound desc = could not find container \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": container with ID starting with 
a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.485374 4728 scope.go:117] "RemoveContainer" containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.485645 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} err="failed to get container status \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": rpc error: code = NotFound desc = could not find container \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": container with ID starting with e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.485730 4728 scope.go:117] "RemoveContainer" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.486016 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} err="failed to get container status \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": rpc error: code = NotFound desc = could not find container \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": container with ID starting with d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.486083 4728 scope.go:117] "RemoveContainer" containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.486433 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} err="failed to get container status \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": rpc error: code = NotFound desc = could not find container \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": container with ID starting with 84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.486486 4728 scope.go:117] "RemoveContainer" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.486821 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} err="failed to get container status \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": rpc error: code = NotFound desc = could not find container \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": container with ID starting with ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.486845 4728 scope.go:117] "RemoveContainer" containerID="b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.487125 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a"} err="failed to get container status \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": rpc error: code = NotFound desc = could not find container \"b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a\": container with ID starting with b523f242da5924953846891c45919170c487587d5dfb3f2b455b92c3e31f415a not found: ID does not 
exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.487201 4728 scope.go:117] "RemoveContainer" containerID="5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.487549 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938"} err="failed to get container status \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": rpc error: code = NotFound desc = could not find container \"5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938\": container with ID starting with 5399da414ab16bbd1d13003aac1c611b6c09f266ba38193859f194e2c0c19938 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.487597 4728 scope.go:117] "RemoveContainer" containerID="ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.488740 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8"} err="failed to get container status \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": rpc error: code = NotFound desc = could not find container \"ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8\": container with ID starting with ddbde3f645c20f712032f03f71dc8f5e27c352ad9a29f9a15c7d6f51081645a8 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.488819 4728 scope.go:117] "RemoveContainer" containerID="68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.489405 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10"} err="failed to get container status 
\"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": rpc error: code = NotFound desc = could not find container \"68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10\": container with ID starting with 68a962601c2a222de1dd8064cab405638b341a38f21be7f285a0e9cbb5064d10 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.489461 4728 scope.go:117] "RemoveContainer" containerID="07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.489753 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697"} err="failed to get container status \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": rpc error: code = NotFound desc = could not find container \"07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697\": container with ID starting with 07203220bc006d244d1fcc86e12e0b172f4bdbd6aed3e259f6a46a5aa5b29697 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.489838 4728 scope.go:117] "RemoveContainer" containerID="a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.490287 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641"} err="failed to get container status \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": rpc error: code = NotFound desc = could not find container \"a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641\": container with ID starting with a58a6a07e91d8441bdbb7168c582e02e84a54c324b56ad5a9b6ff712252a6641 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.490317 4728 scope.go:117] "RemoveContainer" 
containerID="e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.490669 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426"} err="failed to get container status \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": rpc error: code = NotFound desc = could not find container \"e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426\": container with ID starting with e1a2c8333c59d092f075948f2cd7373d7bd7b896581650c6ab34d654c4c51426 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.490691 4728 scope.go:117] "RemoveContainer" containerID="d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.490998 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd"} err="failed to get container status \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": rpc error: code = NotFound desc = could not find container \"d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd\": container with ID starting with d3ac2cc8c66aa32154c1b1273b255a8e234a33f187362da674e646137f45e8dd not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.491021 4728 scope.go:117] "RemoveContainer" containerID="84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.491390 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325"} err="failed to get container status \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": rpc error: code = NotFound desc = could 
not find container \"84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325\": container with ID starting with 84a54f578387a91ecb9f6374a397973d378fbc3f03120095d8ee62f4fdeca325 not found: ID does not exist" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.491424 4728 scope.go:117] "RemoveContainer" containerID="ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f" Jan 25 05:47:39 crc kubenswrapper[4728]: I0125 05:47:39.491860 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f"} err="failed to get container status \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": rpc error: code = NotFound desc = could not find container \"ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f\": container with ID starting with ca6e327b919ae3e721bcb22c152290dc34c767578bd5c479d009f95842b4077f not found: ID does not exist" Jan 25 05:47:40 crc kubenswrapper[4728]: I0125 05:47:40.251971 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/2.log" Jan 25 05:47:40 crc kubenswrapper[4728]: I0125 05:47:40.254485 4728 generic.go:334] "Generic (PLEG): container finished" podID="f7d0a9c3-84ef-4ea0-97c6-68629846367f" containerID="43af3b5154c6f5341a493042cc55e58338c02398329e91efe8ef394e58438850" exitCode=0 Jan 25 05:47:40 crc kubenswrapper[4728]: I0125 05:47:40.254535 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerDied","Data":"43af3b5154c6f5341a493042cc55e58338c02398329e91efe8ef394e58438850"} Jan 25 05:47:40 crc kubenswrapper[4728]: I0125 05:47:40.254598 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" 
event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"156e306766956c5d1734a330bf1d45d439a0d27cfa4f4b21d69c921c71df72fa"} Jan 25 05:47:41 crc kubenswrapper[4728]: I0125 05:47:41.268840 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"8a7e0914571fe9ba2aac80b0e78a04efac4b48d36943fe40b4ee5d777bd57ec3"} Jan 25 05:47:41 crc kubenswrapper[4728]: I0125 05:47:41.269200 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"a4a952355b9b72096fd97e12615f64800d46fab210cbb36e011036212a14983e"} Jan 25 05:47:41 crc kubenswrapper[4728]: I0125 05:47:41.269212 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"3868accaed66923013013d6c7986042eaf13a25ce29fcafca5901b3471dc9943"} Jan 25 05:47:41 crc kubenswrapper[4728]: I0125 05:47:41.269222 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"762e75a08ad1a903d9a390acf4648c40c8841d782dd4671cd545e66e44babed8"} Jan 25 05:47:41 crc kubenswrapper[4728]: I0125 05:47:41.269231 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"c899d26c8ba907bd26a02cae4d303d585a964c5488faef4d72823237b5451812"} Jan 25 05:47:41 crc kubenswrapper[4728]: I0125 05:47:41.269240 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" 
event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"b93779f27906c119e8d47eedf9d99f47576613c7b78b276aa9aa755933972174"} Jan 25 05:47:43 crc kubenswrapper[4728]: I0125 05:47:43.282761 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"ea73f309923513efa49f0164d58ac775714d4189bbc960778b9c5eefa2f05257"} Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.293730 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" event={"ID":"f7d0a9c3-84ef-4ea0-97c6-68629846367f","Type":"ContainerStarted","Data":"2d135cb14e1b64bdcf6599c12bcbb2aab5ad2b1aa9cae5edf81006ef38deca6f"} Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.295530 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.295557 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.295609 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.318697 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.318998 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:47:45 crc kubenswrapper[4728]: I0125 05:47:45.329800 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" podStartSLOduration=6.329791345 podStartE2EDuration="6.329791345s" podCreationTimestamp="2026-01-25 
05:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:47:45.326922906 +0000 UTC m=+556.362800886" watchObservedRunningTime="2026-01-25 05:47:45.329791345 +0000 UTC m=+556.365669325" Jan 25 05:47:53 crc kubenswrapper[4728]: I0125 05:47:53.329221 4728 scope.go:117] "RemoveContainer" containerID="02594a5778eadcc12813b7374fa9212bd49759f86eda609cec8b87645a3c371e" Jan 25 05:47:53 crc kubenswrapper[4728]: E0125 05:47:53.330168 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kdxw7_openshift-multus(c2ffc038-3d70-4d2c-b150-e8529f622238)\"" pod="openshift-multus/multus-kdxw7" podUID="c2ffc038-3d70-4d2c-b150-e8529f622238" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.265085 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm"] Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.267183 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.270703 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.278694 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm"] Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.328496 4728 scope.go:117] "RemoveContainer" containerID="02594a5778eadcc12813b7374fa9212bd49759f86eda609cec8b87645a3c371e" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.398926 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.399272 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.399314 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gcrd\" (UniqueName: \"kubernetes.io/projected/11334404-2639-4444-a499-8312bc233ad6-kube-api-access-6gcrd\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: 
\"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.500975 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.501032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.501052 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gcrd\" (UniqueName: \"kubernetes.io/projected/11334404-2639-4444-a499-8312bc233ad6-kube-api-access-6gcrd\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.501705 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc 
kubenswrapper[4728]: I0125 05:48:07.501812 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.518930 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gcrd\" (UniqueName: \"kubernetes.io/projected/11334404-2639-4444-a499-8312bc233ad6-kube-api-access-6gcrd\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: I0125 05:48:07.588816 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: E0125 05:48:07.632006 4728 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e0feea57b2934fc39fa69a0c4d50f2ab1a11d864c5707ba06f37699f4a4cf6dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 25 05:48:07 crc kubenswrapper[4728]: E0125 05:48:07.632124 4728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e0feea57b2934fc39fa69a0c4d50f2ab1a11d864c5707ba06f37699f4a4cf6dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: E0125 05:48:07.632152 4728 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e0feea57b2934fc39fa69a0c4d50f2ab1a11d864c5707ba06f37699f4a4cf6dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:07 crc kubenswrapper[4728]: E0125 05:48:07.632217 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace(11334404-2639-4444-a499-8312bc233ad6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace(11334404-2639-4444-a499-8312bc233ad6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e0feea57b2934fc39fa69a0c4d50f2ab1a11d864c5707ba06f37699f4a4cf6dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" podUID="11334404-2639-4444-a499-8312bc233ad6" Jan 25 05:48:08 crc kubenswrapper[4728]: I0125 05:48:08.414011 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdxw7_c2ffc038-3d70-4d2c-b150-e8529f622238/kube-multus/2.log" Jan 25 05:48:08 crc kubenswrapper[4728]: I0125 05:48:08.414122 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:08 crc kubenswrapper[4728]: I0125 05:48:08.414118 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdxw7" event={"ID":"c2ffc038-3d70-4d2c-b150-e8529f622238","Type":"ContainerStarted","Data":"f9a24457ed8599eaba16d5b057ccf7e543681b794e80a02555853814a11ba829"} Jan 25 05:48:08 crc kubenswrapper[4728]: I0125 05:48:08.414465 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:08 crc kubenswrapper[4728]: E0125 05:48:08.441100 4728 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e00bef77a8a294d314ded07d22ab4aae7f315bedd66877b1f84f6142ae97f422): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 25 05:48:08 crc kubenswrapper[4728]: E0125 05:48:08.441155 4728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e00bef77a8a294d314ded07d22ab4aae7f315bedd66877b1f84f6142ae97f422): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:08 crc kubenswrapper[4728]: E0125 05:48:08.441178 4728 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e00bef77a8a294d314ded07d22ab4aae7f315bedd66877b1f84f6142ae97f422): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:08 crc kubenswrapper[4728]: E0125 05:48:08.441219 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace(11334404-2639-4444-a499-8312bc233ad6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace(11334404-2639-4444-a499-8312bc233ad6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_openshift-marketplace_11334404-2639-4444-a499-8312bc233ad6_0(e00bef77a8a294d314ded07d22ab4aae7f315bedd66877b1f84f6142ae97f422): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" podUID="11334404-2639-4444-a499-8312bc233ad6" Jan 25 05:48:09 crc kubenswrapper[4728]: I0125 05:48:09.381882 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz9fw" Jan 25 05:48:12 crc kubenswrapper[4728]: I0125 05:48:12.899269 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:48:12 crc kubenswrapper[4728]: I0125 05:48:12.899385 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:48:19 crc kubenswrapper[4728]: I0125 05:48:19.328565 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:19 crc kubenswrapper[4728]: I0125 05:48:19.330977 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:19 crc kubenswrapper[4728]: I0125 05:48:19.702590 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm"] Jan 25 05:48:20 crc kubenswrapper[4728]: I0125 05:48:20.481050 4728 generic.go:334] "Generic (PLEG): container finished" podID="11334404-2639-4444-a499-8312bc233ad6" containerID="0d298668649a83458b8c9c3eb9a6e60739ac8d549404f13e1a265ca27cbdffb4" exitCode=0 Jan 25 05:48:20 crc kubenswrapper[4728]: I0125 05:48:20.481107 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" event={"ID":"11334404-2639-4444-a499-8312bc233ad6","Type":"ContainerDied","Data":"0d298668649a83458b8c9c3eb9a6e60739ac8d549404f13e1a265ca27cbdffb4"} Jan 25 05:48:20 crc kubenswrapper[4728]: I0125 05:48:20.481142 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" event={"ID":"11334404-2639-4444-a499-8312bc233ad6","Type":"ContainerStarted","Data":"465d3ee8bee10dc17e26fd87b66d518a02b59b0c1da62b101980b2b1292136b9"} Jan 25 05:48:22 crc kubenswrapper[4728]: I0125 05:48:22.492114 4728 generic.go:334] "Generic (PLEG): container finished" podID="11334404-2639-4444-a499-8312bc233ad6" containerID="bae86c7294b738138d55bf5fb2a26b0e6975fcd766ab8eff8ad0860146107275" exitCode=0 Jan 25 05:48:22 crc kubenswrapper[4728]: I0125 05:48:22.492222 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" event={"ID":"11334404-2639-4444-a499-8312bc233ad6","Type":"ContainerDied","Data":"bae86c7294b738138d55bf5fb2a26b0e6975fcd766ab8eff8ad0860146107275"} Jan 25 05:48:23 crc kubenswrapper[4728]: I0125 05:48:23.501513 4728 
generic.go:334] "Generic (PLEG): container finished" podID="11334404-2639-4444-a499-8312bc233ad6" containerID="3bc37e1bf725517e1c09241b22450198d08e95cbe187132dd2e9e0a0fa76a7d2" exitCode=0 Jan 25 05:48:23 crc kubenswrapper[4728]: I0125 05:48:23.501602 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" event={"ID":"11334404-2639-4444-a499-8312bc233ad6","Type":"ContainerDied","Data":"3bc37e1bf725517e1c09241b22450198d08e95cbe187132dd2e9e0a0fa76a7d2"} Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.709990 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.882433 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gcrd\" (UniqueName: \"kubernetes.io/projected/11334404-2639-4444-a499-8312bc233ad6-kube-api-access-6gcrd\") pod \"11334404-2639-4444-a499-8312bc233ad6\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.882543 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-util\") pod \"11334404-2639-4444-a499-8312bc233ad6\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.882585 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-bundle\") pod \"11334404-2639-4444-a499-8312bc233ad6\" (UID: \"11334404-2639-4444-a499-8312bc233ad6\") " Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.883117 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-bundle" (OuterVolumeSpecName: "bundle") pod "11334404-2639-4444-a499-8312bc233ad6" (UID: "11334404-2639-4444-a499-8312bc233ad6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.888938 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11334404-2639-4444-a499-8312bc233ad6-kube-api-access-6gcrd" (OuterVolumeSpecName: "kube-api-access-6gcrd") pod "11334404-2639-4444-a499-8312bc233ad6" (UID: "11334404-2639-4444-a499-8312bc233ad6"). InnerVolumeSpecName "kube-api-access-6gcrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.892804 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-util" (OuterVolumeSpecName: "util") pod "11334404-2639-4444-a499-8312bc233ad6" (UID: "11334404-2639-4444-a499-8312bc233ad6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.984112 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gcrd\" (UniqueName: \"kubernetes.io/projected/11334404-2639-4444-a499-8312bc233ad6-kube-api-access-6gcrd\") on node \"crc\" DevicePath \"\"" Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.984146 4728 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-util\") on node \"crc\" DevicePath \"\"" Jan 25 05:48:24 crc kubenswrapper[4728]: I0125 05:48:24.984156 4728 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11334404-2639-4444-a499-8312bc233ad6-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:48:25 crc kubenswrapper[4728]: I0125 05:48:25.514532 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" event={"ID":"11334404-2639-4444-a499-8312bc233ad6","Type":"ContainerDied","Data":"465d3ee8bee10dc17e26fd87b66d518a02b59b0c1da62b101980b2b1292136b9"} Jan 25 05:48:25 crc kubenswrapper[4728]: I0125 05:48:25.514605 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="465d3ee8bee10dc17e26fd87b66d518a02b59b0c1da62b101980b2b1292136b9" Jan 25 05:48:25 crc kubenswrapper[4728]: I0125 05:48:25.514651 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.829182 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-cr8sf"] Jan 25 05:48:28 crc kubenswrapper[4728]: E0125 05:48:28.829705 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="pull" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.829719 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="pull" Jan 25 05:48:28 crc kubenswrapper[4728]: E0125 05:48:28.829737 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="extract" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.829743 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="extract" Jan 25 05:48:28 crc kubenswrapper[4728]: E0125 05:48:28.829763 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="util" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.829769 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="util" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.829870 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="11334404-2639-4444-a499-8312bc233ad6" containerName="extract" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.830289 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.832526 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6dksq" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.832584 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.833973 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.843517 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-cr8sf"] Jan 25 05:48:28 crc kubenswrapper[4728]: I0125 05:48:28.929113 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz89s\" (UniqueName: \"kubernetes.io/projected/8a8de132-1d94-4947-bc4e-0968643f10e0-kube-api-access-nz89s\") pod \"nmstate-operator-646758c888-cr8sf\" (UID: \"8a8de132-1d94-4947-bc4e-0968643f10e0\") " pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" Jan 25 05:48:29 crc kubenswrapper[4728]: I0125 05:48:29.030461 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz89s\" (UniqueName: \"kubernetes.io/projected/8a8de132-1d94-4947-bc4e-0968643f10e0-kube-api-access-nz89s\") pod \"nmstate-operator-646758c888-cr8sf\" (UID: \"8a8de132-1d94-4947-bc4e-0968643f10e0\") " pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" Jan 25 05:48:29 crc kubenswrapper[4728]: I0125 05:48:29.049000 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz89s\" (UniqueName: \"kubernetes.io/projected/8a8de132-1d94-4947-bc4e-0968643f10e0-kube-api-access-nz89s\") pod \"nmstate-operator-646758c888-cr8sf\" (UID: 
\"8a8de132-1d94-4947-bc4e-0968643f10e0\") " pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" Jan 25 05:48:29 crc kubenswrapper[4728]: I0125 05:48:29.142877 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" Jan 25 05:48:29 crc kubenswrapper[4728]: I0125 05:48:29.529517 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-cr8sf"] Jan 25 05:48:30 crc kubenswrapper[4728]: I0125 05:48:30.546725 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" event={"ID":"8a8de132-1d94-4947-bc4e-0968643f10e0","Type":"ContainerStarted","Data":"ab302282b8aab4df09ff3c8379493d2d8fe4a6c48f5abc81b5173588b350f459"} Jan 25 05:48:32 crc kubenswrapper[4728]: I0125 05:48:32.561854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" event={"ID":"8a8de132-1d94-4947-bc4e-0968643f10e0","Type":"ContainerStarted","Data":"0be15ef5f02ce8f1a79c25e2b10818f8a6688b8c904f88d72ad9b73e6e386a4a"} Jan 25 05:48:32 crc kubenswrapper[4728]: I0125 05:48:32.579500 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-cr8sf" podStartSLOduration=2.6564597599999997 podStartE2EDuration="4.579476781s" podCreationTimestamp="2026-01-25 05:48:28 +0000 UTC" firstStartedPulling="2026-01-25 05:48:29.540349378 +0000 UTC m=+600.576227358" lastFinishedPulling="2026-01-25 05:48:31.463366399 +0000 UTC m=+602.499244379" observedRunningTime="2026-01-25 05:48:32.575167807 +0000 UTC m=+603.611045787" watchObservedRunningTime="2026-01-25 05:48:32.579476781 +0000 UTC m=+603.615354761" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.546888 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-zzzcv"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 
05:48:37.548101 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.549944 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qrz95" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.555014 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-zzzcv"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.566047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpc2v\" (UniqueName: \"kubernetes.io/projected/90dfd7eb-b907-4cc3-95c5-69d9cb694372-kube-api-access-rpc2v\") pod \"nmstate-metrics-54757c584b-zzzcv\" (UID: \"90dfd7eb-b907-4cc3-95c5-69d9cb694372\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.576450 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6qpdd"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.577231 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.590777 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.592135 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.595853 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.605309 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.657293 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.658099 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.659370 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.659464 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mctbr" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.659964 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667458 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-nmstate-lock\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667493 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-ovs-socket\") 
pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667531 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/67c7adf0-d43f-47b5-8997-4d691eee4e4f-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667551 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdv4p\" (UniqueName: \"kubernetes.io/projected/67c7adf0-d43f-47b5-8997-4d691eee4e4f-kube-api-access-gdv4p\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667576 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j76gv\" (UniqueName: \"kubernetes.io/projected/417875b7-d358-4db4-ad01-1e31c98e4955-kube-api-access-j76gv\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667647 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpc2v\" (UniqueName: \"kubernetes.io/projected/90dfd7eb-b907-4cc3-95c5-69d9cb694372-kube-api-access-rpc2v\") pod \"nmstate-metrics-54757c584b-zzzcv\" (UID: \"90dfd7eb-b907-4cc3-95c5-69d9cb694372\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.667696 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" 
(UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-dbus-socket\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.669550 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"] Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.685099 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpc2v\" (UniqueName: \"kubernetes.io/projected/90dfd7eb-b907-4cc3-95c5-69d9cb694372-kube-api-access-rpc2v\") pod \"nmstate-metrics-54757c584b-zzzcv\" (UID: \"90dfd7eb-b907-4cc3-95c5-69d9cb694372\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769485 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-dbus-socket\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-nmstate-lock\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769569 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-ovs-socket\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd" Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 
05:48:37.769607 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ea342d-9a20-4776-80b3-0132cefb2983-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769642 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5f6\" (UniqueName: \"kubernetes.io/projected/a5ea342d-9a20-4776-80b3-0132cefb2983-kube-api-access-5p5f6\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769676 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/67c7adf0-d43f-47b5-8997-4d691eee4e4f-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769676 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-nmstate-lock\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769703 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdv4p\" (UniqueName: \"kubernetes.io/projected/67c7adf0-d43f-47b5-8997-4d691eee4e4f-kube-api-access-gdv4p\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769787 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j76gv\" (UniqueName: \"kubernetes.io/projected/417875b7-d358-4db4-ad01-1e31c98e4955-kube-api-access-j76gv\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5ea342d-9a20-4776-80b3-0132cefb2983-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.769921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-dbus-socket\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.770142 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/417875b7-d358-4db4-ad01-1e31c98e4955-ovs-socket\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:37 crc kubenswrapper[4728]: E0125 05:48:37.770143 4728 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Jan 25 05:48:37 crc kubenswrapper[4728]: E0125 05:48:37.770252 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c7adf0-d43f-47b5-8997-4d691eee4e4f-tls-key-pair podName:67c7adf0-d43f-47b5-8997-4d691eee4e4f nodeName:}" failed. No retries permitted until 2026-01-25 05:48:38.270222403 +0000 UTC m=+609.306100384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/67c7adf0-d43f-47b5-8997-4d691eee4e4f-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-528j4" (UID: "67c7adf0-d43f-47b5-8997-4d691eee4e4f") : secret "openshift-nmstate-webhook" not found
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.789887 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdv4p\" (UniqueName: \"kubernetes.io/projected/67c7adf0-d43f-47b5-8997-4d691eee4e4f-kube-api-access-gdv4p\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.793281 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j76gv\" (UniqueName: \"kubernetes.io/projected/417875b7-d358-4db4-ad01-1e31c98e4955-kube-api-access-j76gv\") pod \"nmstate-handler-6qpdd\" (UID: \"417875b7-d358-4db4-ad01-1e31c98e4955\") " pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.836945 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64b87774c4-k6hrl"]
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.837690 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.846190 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b87774c4-k6hrl"]
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.863417 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-oauth-serving-cert\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871054 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ea342d-9a20-4776-80b3-0132cefb2983-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-service-ca\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871099 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5f6\" (UniqueName: \"kubernetes.io/projected/a5ea342d-9a20-4776-80b3-0132cefb2983-kube-api-access-5p5f6\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871117 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-config\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871158 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-oauth-config\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871182 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-serving-cert\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871218 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5ea342d-9a20-4776-80b3-0132cefb2983-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871239 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-trusted-ca-bundle\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.871268 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzhdl\" (UniqueName: \"kubernetes.io/projected/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-kube-api-access-kzhdl\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.872063 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5ea342d-9a20-4776-80b3-0132cefb2983-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.874274 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ea342d-9a20-4776-80b3-0132cefb2983-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.890123 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5f6\" (UniqueName: \"kubernetes.io/projected/a5ea342d-9a20-4776-80b3-0132cefb2983-kube-api-access-5p5f6\") pod \"nmstate-console-plugin-7754f76f8b-2s8cq\" (UID: \"a5ea342d-9a20-4776-80b3-0132cefb2983\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.893163 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.970967 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972454 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-oauth-config\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972514 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-serving-cert\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972564 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-trusted-ca-bundle\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972615 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzhdl\" (UniqueName: \"kubernetes.io/projected/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-kube-api-access-kzhdl\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972648 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-oauth-serving-cert\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972668 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-service-ca\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.972690 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-config\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.974019 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-oauth-serving-cert\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.974225 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-config\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.974737 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-trusted-ca-bundle\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.975384 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-service-ca\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.977526 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-serving-cert\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.977615 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-console-oauth-config\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:37 crc kubenswrapper[4728]: I0125 05:48:37.995707 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzhdl\" (UniqueName: \"kubernetes.io/projected/0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a-kube-api-access-kzhdl\") pod \"console-64b87774c4-k6hrl\" (UID: \"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a\") " pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.029730 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-zzzcv"]
Jan 25 05:48:38 crc kubenswrapper[4728]: W0125 05:48:38.034245 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90dfd7eb_b907_4cc3_95c5_69d9cb694372.slice/crio-f59018d9d111c06ec828eb0f18517e738559f210bf9da70cd55d712f49af4bc6 WatchSource:0}: Error finding container f59018d9d111c06ec828eb0f18517e738559f210bf9da70cd55d712f49af4bc6: Status 404 returned error can't find the container with id f59018d9d111c06ec828eb0f18517e738559f210bf9da70cd55d712f49af4bc6
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.149736 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.277013 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/67c7adf0-d43f-47b5-8997-4d691eee4e4f-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.280507 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/67c7adf0-d43f-47b5-8997-4d691eee4e4f-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-528j4\" (UID: \"67c7adf0-d43f-47b5-8997-4d691eee4e4f\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.304945 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b87774c4-k6hrl"]
Jan 25 05:48:38 crc kubenswrapper[4728]: W0125 05:48:38.309437 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df0a47a_cbc6_4e9d_aa18_5b42eeaf262a.slice/crio-a42dd87959c4619ace14f22c488a12a7d37dd854102165c57735184b8bec917c WatchSource:0}: Error finding container a42dd87959c4619ace14f22c488a12a7d37dd854102165c57735184b8bec917c: Status 404 returned error can't find the container with id a42dd87959c4619ace14f22c488a12a7d37dd854102165c57735184b8bec917c
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.326395 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq"]
Jan 25 05:48:38 crc kubenswrapper[4728]: W0125 05:48:38.330706 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ea342d_9a20_4776_80b3_0132cefb2983.slice/crio-dcd152a85de0d6a37d5445dc7e87f606444eece72a6f7c658a1fd4e70610c095 WatchSource:0}: Error finding container dcd152a85de0d6a37d5445dc7e87f606444eece72a6f7c658a1fd4e70610c095: Status 404 returned error can't find the container with id dcd152a85de0d6a37d5445dc7e87f606444eece72a6f7c658a1fd4e70610c095
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.510799 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.594170 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6qpdd" event={"ID":"417875b7-d358-4db4-ad01-1e31c98e4955","Type":"ContainerStarted","Data":"af8c8b8d9add766c67a40b9e385e4b4d45b60b88d5666aeacef0fb2cd17a89de"}
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.595638 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" event={"ID":"90dfd7eb-b907-4cc3-95c5-69d9cb694372","Type":"ContainerStarted","Data":"f59018d9d111c06ec828eb0f18517e738559f210bf9da70cd55d712f49af4bc6"}
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.596918 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq" event={"ID":"a5ea342d-9a20-4776-80b3-0132cefb2983","Type":"ContainerStarted","Data":"dcd152a85de0d6a37d5445dc7e87f606444eece72a6f7c658a1fd4e70610c095"}
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.598271 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b87774c4-k6hrl" event={"ID":"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a","Type":"ContainerStarted","Data":"ae8297f60ce3bda2d6c12f5b0354164ad977375f221d0a25c47afe0fd3a8b803"}
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.598302 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b87774c4-k6hrl" event={"ID":"0df0a47a-cbc6-4e9d-aa18-5b42eeaf262a","Type":"ContainerStarted","Data":"a42dd87959c4619ace14f22c488a12a7d37dd854102165c57735184b8bec917c"}
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.621356 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64b87774c4-k6hrl" podStartSLOduration=1.621307915 podStartE2EDuration="1.621307915s" podCreationTimestamp="2026-01-25 05:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:48:38.614111117 +0000 UTC m=+609.649989096" watchObservedRunningTime="2026-01-25 05:48:38.621307915 +0000 UTC m=+609.657185895"
Jan 25 05:48:38 crc kubenswrapper[4728]: I0125 05:48:38.882781 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"]
Jan 25 05:48:38 crc kubenswrapper[4728]: W0125 05:48:38.886141 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c7adf0_d43f_47b5_8997_4d691eee4e4f.slice/crio-cfe7390fbc9d68326881138aa365530712fbdf7d42b65b96595d049ceb4d9b3d WatchSource:0}: Error finding container cfe7390fbc9d68326881138aa365530712fbdf7d42b65b96595d049ceb4d9b3d: Status 404 returned error can't find the container with id cfe7390fbc9d68326881138aa365530712fbdf7d42b65b96595d049ceb4d9b3d
Jan 25 05:48:39 crc kubenswrapper[4728]: I0125 05:48:39.604156 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4" event={"ID":"67c7adf0-d43f-47b5-8997-4d691eee4e4f","Type":"ContainerStarted","Data":"cfe7390fbc9d68326881138aa365530712fbdf7d42b65b96595d049ceb4d9b3d"}
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.609530 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" event={"ID":"90dfd7eb-b907-4cc3-95c5-69d9cb694372","Type":"ContainerStarted","Data":"2b17e780cfbe1c146ceb4a840e7329d60b652a802c0a8493df12deba5ad69231"}
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.611779 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq" event={"ID":"a5ea342d-9a20-4776-80b3-0132cefb2983","Type":"ContainerStarted","Data":"296cad66913865e1635e24ebcbce71c42374bc9379fe7fb2a550295b4fa989b9"}
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.614101 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.615252 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4" event={"ID":"67c7adf0-d43f-47b5-8997-4d691eee4e4f","Type":"ContainerStarted","Data":"532c053abf0a029bcef65dd099779357e355e66ab434c323319b0526ad3c4672"}
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.615358 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.622612 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2s8cq" podStartSLOduration=1.574112534 podStartE2EDuration="3.622594413s" podCreationTimestamp="2026-01-25 05:48:37 +0000 UTC" firstStartedPulling="2026-01-25 05:48:38.333792442 +0000 UTC m=+609.369670422" lastFinishedPulling="2026-01-25 05:48:40.382274321 +0000 UTC m=+611.418152301" observedRunningTime="2026-01-25 05:48:40.6200684 +0000 UTC m=+611.655946381" watchObservedRunningTime="2026-01-25 05:48:40.622594413 +0000 UTC m=+611.658472393"
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.632347 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6qpdd" podStartSLOduration=1.160320983 podStartE2EDuration="3.632306704s" podCreationTimestamp="2026-01-25 05:48:37 +0000 UTC" firstStartedPulling="2026-01-25 05:48:37.913535902 +0000 UTC m=+608.949413882" lastFinishedPulling="2026-01-25 05:48:40.385521633 +0000 UTC m=+611.421399603" observedRunningTime="2026-01-25 05:48:40.632163885 +0000 UTC m=+611.668041865" watchObservedRunningTime="2026-01-25 05:48:40.632306704 +0000 UTC m=+611.668184685"
Jan 25 05:48:40 crc kubenswrapper[4728]: I0125 05:48:40.647935 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4" podStartSLOduration=2.152810003 podStartE2EDuration="3.647919309s" podCreationTimestamp="2026-01-25 05:48:37 +0000 UTC" firstStartedPulling="2026-01-25 05:48:38.889030001 +0000 UTC m=+609.924907980" lastFinishedPulling="2026-01-25 05:48:40.384139306 +0000 UTC m=+611.420017286" observedRunningTime="2026-01-25 05:48:40.646919034 +0000 UTC m=+611.682797013" watchObservedRunningTime="2026-01-25 05:48:40.647919309 +0000 UTC m=+611.683797290"
Jan 25 05:48:41 crc kubenswrapper[4728]: I0125 05:48:41.623360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6qpdd" event={"ID":"417875b7-d358-4db4-ad01-1e31c98e4955","Type":"ContainerStarted","Data":"b55f7a66e01b232e084445c414f118b2bf397cd463c8e028fb246fe1644440c2"}
Jan 25 05:48:42 crc kubenswrapper[4728]: I0125 05:48:42.632355 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" event={"ID":"90dfd7eb-b907-4cc3-95c5-69d9cb694372","Type":"ContainerStarted","Data":"87ec77a19615ebf75614fc72271da0e096db5909630358d881d7b7fcfe86e09a"}
Jan 25 05:48:42 crc kubenswrapper[4728]: I0125 05:48:42.666550 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-zzzcv" podStartSLOduration=1.298303331 podStartE2EDuration="5.666528127s" podCreationTimestamp="2026-01-25 05:48:37 +0000 UTC" firstStartedPulling="2026-01-25 05:48:38.037083105 +0000 UTC m=+609.072961085" lastFinishedPulling="2026-01-25 05:48:42.4053079 +0000 UTC m=+613.441185881" observedRunningTime="2026-01-25 05:48:42.645937827 +0000 UTC m=+613.681815807" watchObservedRunningTime="2026-01-25 05:48:42.666528127 +0000 UTC m=+613.702406267"
Jan 25 05:48:42 crc kubenswrapper[4728]: I0125 05:48:42.899407 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 25 05:48:42 crc kubenswrapper[4728]: I0125 05:48:42.899485 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 25 05:48:47 crc kubenswrapper[4728]: I0125 05:48:47.919824 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6qpdd"
Jan 25 05:48:48 crc kubenswrapper[4728]: I0125 05:48:48.150750 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:48 crc kubenswrapper[4728]: I0125 05:48:48.150938 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:48 crc kubenswrapper[4728]: I0125 05:48:48.154826 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:48 crc kubenswrapper[4728]: I0125 05:48:48.671936 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64b87774c4-k6hrl"
Jan 25 05:48:48 crc kubenswrapper[4728]: I0125 05:48:48.713694 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ntzq4"]
Jan 25 05:48:58 crc kubenswrapper[4728]: I0125 05:48:58.516521 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-528j4"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.040778 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"]
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.042488 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.043964 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.049393 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"]
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.103705 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.103764 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj56r\" (UniqueName: \"kubernetes.io/projected/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-kube-api-access-cj56r\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.103873 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.204796 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.204871 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.204897 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj56r\" (UniqueName: \"kubernetes.io/projected/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-kube-api-access-cj56r\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.205268 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.205366 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.222931 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj56r\" (UniqueName: \"kubernetes.io/projected/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-kube-api-access-cj56r\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.355345 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.693761 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29"]
Jan 25 05:49:09 crc kubenswrapper[4728]: I0125 05:49:09.780767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" event={"ID":"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8","Type":"ContainerStarted","Data":"286f205a0fff9b3e78036813c39acffdb89661921536b3d4dcac368191c7f112"}
Jan 25 05:49:10 crc kubenswrapper[4728]: I0125 05:49:10.786775 4728 generic.go:334] "Generic (PLEG): container finished" podID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerID="9aaf51e8551e9ffc20819d0a2c1344011f7163cac64e059444d8f96b39e7a63d" exitCode=0
Jan 25 05:49:10 crc kubenswrapper[4728]: I0125 05:49:10.786831 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" event={"ID":"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8","Type":"ContainerDied","Data":"9aaf51e8551e9ffc20819d0a2c1344011f7163cac64e059444d8f96b39e7a63d"}
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.796341 4728 generic.go:334] "Generic (PLEG): container finished" podID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerID="0f28f914a83dcd87ca73f4658397f113046edc64e69db28d24bc77b71fa24240" exitCode=0
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.796481 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" event={"ID":"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8","Type":"ContainerDied","Data":"0f28f914a83dcd87ca73f4658397f113046edc64e69db28d24bc77b71fa24240"}
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.899571 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.899612 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.899649 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.900199 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4869a4031b431cc23a01935621e0bf0cd63107d4d6edf2fef74234f9435dad57"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 25 05:49:12 crc kubenswrapper[4728]: I0125 05:49:12.900248 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://4869a4031b431cc23a01935621e0bf0cd63107d4d6edf2fef74234f9435dad57" gracePeriod=600
Jan 25 05:49:13 crc kubenswrapper[4728]: I0125 05:49:13.743354 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ntzq4" podUID="607307b0-b3c9-4a00-9347-a299a689c1c8" containerName="console" containerID="cri-o://cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c" gracePeriod=15
Jan 25 05:49:13 crc kubenswrapper[4728]: I0125 05:49:13.803646 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="4869a4031b431cc23a01935621e0bf0cd63107d4d6edf2fef74234f9435dad57" exitCode=0
Jan 25 05:49:13 crc kubenswrapper[4728]: I0125 05:49:13.803746 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"4869a4031b431cc23a01935621e0bf0cd63107d4d6edf2fef74234f9435dad57"}
Jan 25 05:49:13 crc kubenswrapper[4728]: I0125 05:49:13.803980 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"5b2523b29483494490949dbc53c4bdb9d3c9b4b7a93fe4055f11cc91a7d873b4"}
Jan 25 05:49:13 crc kubenswrapper[4728]: I0125
05:49:13.804008 4728 scope.go:117] "RemoveContainer" containerID="9c6bd49d9b17f994e00e405d6b8f16b6edd37de171ad4d27462fcbdcfc065a69" Jan 25 05:49:13 crc kubenswrapper[4728]: I0125 05:49:13.805966 4728 generic.go:334] "Generic (PLEG): container finished" podID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerID="81bad0e270d2a70c776654fb47c5c9b252fc280ee0c0f5793ae325e36ee97342" exitCode=0 Jan 25 05:49:13 crc kubenswrapper[4728]: I0125 05:49:13.806018 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" event={"ID":"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8","Type":"ContainerDied","Data":"81bad0e270d2a70c776654fb47c5c9b252fc280ee0c0f5793ae325e36ee97342"} Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.053165 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ntzq4_607307b0-b3c9-4a00-9347-a299a689c1c8/console/0.log" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.053234 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157466 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-console-config\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157517 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-oauth-serving-cert\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157545 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-service-ca\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157640 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vs94\" (UniqueName: \"kubernetes.io/projected/607307b0-b3c9-4a00-9347-a299a689c1c8-kube-api-access-8vs94\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157660 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-trusted-ca-bundle\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157691 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-serving-cert\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.157727 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-oauth-config\") pod \"607307b0-b3c9-4a00-9347-a299a689c1c8\" (UID: \"607307b0-b3c9-4a00-9347-a299a689c1c8\") " Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.158145 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-console-config" (OuterVolumeSpecName: "console-config") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.158450 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.158538 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-service-ca" (OuterVolumeSpecName: "service-ca") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.158565 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.163721 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607307b0-b3c9-4a00-9347-a299a689c1c8-kube-api-access-8vs94" (OuterVolumeSpecName: "kube-api-access-8vs94") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "kube-api-access-8vs94". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.167050 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.169961 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "607307b0-b3c9-4a00-9347-a299a689c1c8" (UID: "607307b0-b3c9-4a00-9347-a299a689c1c8"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.258788 4728 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-console-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.258885 4728 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.258939 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-service-ca\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.259008 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vs94\" (UniqueName: \"kubernetes.io/projected/607307b0-b3c9-4a00-9347-a299a689c1c8-kube-api-access-8vs94\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.259061 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607307b0-b3c9-4a00-9347-a299a689c1c8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.259110 4728 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.259155 4728 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/607307b0-b3c9-4a00-9347-a299a689c1c8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:14 crc 
kubenswrapper[4728]: I0125 05:49:14.815166 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ntzq4_607307b0-b3c9-4a00-9347-a299a689c1c8/console/0.log" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.815499 4728 generic.go:334] "Generic (PLEG): container finished" podID="607307b0-b3c9-4a00-9347-a299a689c1c8" containerID="cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c" exitCode=2 Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.815566 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ntzq4" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.815607 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntzq4" event={"ID":"607307b0-b3c9-4a00-9347-a299a689c1c8","Type":"ContainerDied","Data":"cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c"} Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.815659 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntzq4" event={"ID":"607307b0-b3c9-4a00-9347-a299a689c1c8","Type":"ContainerDied","Data":"e6d8128d4e4fc874f92f75806092eeab8f00e39fb32b8662f69daefc2dfe2060"} Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.815677 4728 scope.go:117] "RemoveContainer" containerID="cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.828991 4728 scope.go:117] "RemoveContainer" containerID="cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c" Jan 25 05:49:14 crc kubenswrapper[4728]: E0125 05:49:14.829283 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c\": container with ID starting with cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c not 
found: ID does not exist" containerID="cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.829310 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c"} err="failed to get container status \"cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c\": rpc error: code = NotFound desc = could not find container \"cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c\": container with ID starting with cb9cf414d293413c1bae3c0335af91d9c01a82a1afee2ff2835acaf8cfebf60c not found: ID does not exist" Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.838755 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ntzq4"] Jan 25 05:49:14 crc kubenswrapper[4728]: I0125 05:49:14.840760 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ntzq4"] Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.002913 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.168138 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-bundle\") pod \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.168200 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-util\") pod \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.168234 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj56r\" (UniqueName: \"kubernetes.io/projected/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-kube-api-access-cj56r\") pod \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\" (UID: \"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8\") " Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.169179 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-bundle" (OuterVolumeSpecName: "bundle") pod "ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" (UID: "ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.173614 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-kube-api-access-cj56r" (OuterVolumeSpecName: "kube-api-access-cj56r") pod "ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" (UID: "ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8"). InnerVolumeSpecName "kube-api-access-cj56r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.178290 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-util" (OuterVolumeSpecName: "util") pod "ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" (UID: "ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.268980 4728 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-util\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.269010 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj56r\" (UniqueName: \"kubernetes.io/projected/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-kube-api-access-cj56r\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.269022 4728 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.334194 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607307b0-b3c9-4a00-9347-a299a689c1c8" path="/var/lib/kubelet/pods/607307b0-b3c9-4a00-9347-a299a689c1c8/volumes" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.822211 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" event={"ID":"ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8","Type":"ContainerDied","Data":"286f205a0fff9b3e78036813c39acffdb89661921536b3d4dcac368191c7f112"} Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.822257 4728 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="286f205a0fff9b3e78036813c39acffdb89661921536b3d4dcac368191c7f112" Jan 25 05:49:15 crc kubenswrapper[4728]: I0125 05:49:15.822226 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.910804 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg"] Jan 25 05:49:30 crc kubenswrapper[4728]: E0125 05:49:30.911497 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerName="pull" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.911509 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerName="pull" Jan 25 05:49:30 crc kubenswrapper[4728]: E0125 05:49:30.911523 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerName="extract" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.911529 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerName="extract" Jan 25 05:49:30 crc kubenswrapper[4728]: E0125 05:49:30.911536 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607307b0-b3c9-4a00-9347-a299a689c1c8" containerName="console" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.911542 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="607307b0-b3c9-4a00-9347-a299a689c1c8" containerName="console" Jan 25 05:49:30 crc kubenswrapper[4728]: E0125 05:49:30.911553 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerName="util" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.911559 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" 
containerName="util" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.911645 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="607307b0-b3c9-4a00-9347-a299a689c1c8" containerName="console" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.911655 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8" containerName="extract" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.912014 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.913910 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.913969 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8wxh4" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.914413 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.920160 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg"] Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.920452 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.924248 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.940649 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvw8v\" (UniqueName: 
\"kubernetes.io/projected/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-kube-api-access-lvw8v\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.940695 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-webhook-cert\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:30 crc kubenswrapper[4728]: I0125 05:49:30.940730 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-apiservice-cert\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.041487 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvw8v\" (UniqueName: \"kubernetes.io/projected/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-kube-api-access-lvw8v\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.041537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-webhook-cert\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: 
\"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.041573 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-apiservice-cert\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.046948 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-apiservice-cert\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.048669 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-webhook-cert\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.055173 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvw8v\" (UniqueName: \"kubernetes.io/projected/3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef-kube-api-access-lvw8v\") pod \"metallb-operator-controller-manager-5b95c97db5-42zxg\" (UID: \"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef\") " pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.226191 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.253841 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k"] Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.254692 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.257561 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.257766 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.261570 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zzbcl" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.262414 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k"] Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.351703 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857gf\" (UniqueName: \"kubernetes.io/projected/0acd04dc-1416-47ec-97a0-f999c55e5efb-kube-api-access-857gf\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.351788 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0acd04dc-1416-47ec-97a0-f999c55e5efb-webhook-cert\") pod 
\"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.351815 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0acd04dc-1416-47ec-97a0-f999c55e5efb-apiservice-cert\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.452109 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0acd04dc-1416-47ec-97a0-f999c55e5efb-webhook-cert\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.452149 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0acd04dc-1416-47ec-97a0-f999c55e5efb-apiservice-cert\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.452179 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857gf\" (UniqueName: \"kubernetes.io/projected/0acd04dc-1416-47ec-97a0-f999c55e5efb-kube-api-access-857gf\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.456175 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0acd04dc-1416-47ec-97a0-f999c55e5efb-apiservice-cert\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.457135 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0acd04dc-1416-47ec-97a0-f999c55e5efb-webhook-cert\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.469396 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857gf\" (UniqueName: \"kubernetes.io/projected/0acd04dc-1416-47ec-97a0-f999c55e5efb-kube-api-access-857gf\") pod \"metallb-operator-webhook-server-6bb64544-ttk7k\" (UID: \"0acd04dc-1416-47ec-97a0-f999c55e5efb\") " pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.587020 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.612115 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg"] Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.912310 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" event={"ID":"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef","Type":"ContainerStarted","Data":"cad822ee34bb332abf955588fc65250fd46b3615941912c7409df3c94db1662e"} Jan 25 05:49:31 crc kubenswrapper[4728]: I0125 05:49:31.939823 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k"] Jan 25 05:49:31 crc kubenswrapper[4728]: W0125 05:49:31.946039 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0acd04dc_1416_47ec_97a0_f999c55e5efb.slice/crio-ccae5c10aa0c39ea3c85931e0dbcf9adb91e5b0e81df3447290aa4bb7a97ceaf WatchSource:0}: Error finding container ccae5c10aa0c39ea3c85931e0dbcf9adb91e5b0e81df3447290aa4bb7a97ceaf: Status 404 returned error can't find the container with id ccae5c10aa0c39ea3c85931e0dbcf9adb91e5b0e81df3447290aa4bb7a97ceaf Jan 25 05:49:32 crc kubenswrapper[4728]: I0125 05:49:32.920267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" event={"ID":"0acd04dc-1416-47ec-97a0-f999c55e5efb","Type":"ContainerStarted","Data":"ccae5c10aa0c39ea3c85931e0dbcf9adb91e5b0e81df3447290aa4bb7a97ceaf"} Jan 25 05:49:35 crc kubenswrapper[4728]: I0125 05:49:35.940503 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" 
event={"ID":"0acd04dc-1416-47ec-97a0-f999c55e5efb","Type":"ContainerStarted","Data":"f7f526ee0031935e39a4ce103b4cf55a6998bdb669c23d1ec87aaf4bd9475dc1"} Jan 25 05:49:35 crc kubenswrapper[4728]: I0125 05:49:35.940952 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:49:35 crc kubenswrapper[4728]: I0125 05:49:35.959224 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" podStartSLOduration=1.573758864 podStartE2EDuration="4.959189194s" podCreationTimestamp="2026-01-25 05:49:31 +0000 UTC" firstStartedPulling="2026-01-25 05:49:31.949558878 +0000 UTC m=+662.985436859" lastFinishedPulling="2026-01-25 05:49:35.334989209 +0000 UTC m=+666.370867189" observedRunningTime="2026-01-25 05:49:35.956973906 +0000 UTC m=+666.992851876" watchObservedRunningTime="2026-01-25 05:49:35.959189194 +0000 UTC m=+666.995067194" Jan 25 05:49:36 crc kubenswrapper[4728]: I0125 05:49:36.947411 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" event={"ID":"3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef","Type":"ContainerStarted","Data":"f61e49da8a6a0b6debb3f9d683921dd3c4f9674ae989a3efbe3d53bda240c501"} Jan 25 05:49:36 crc kubenswrapper[4728]: I0125 05:49:36.947605 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:49:36 crc kubenswrapper[4728]: I0125 05:49:36.965271 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" podStartSLOduration=2.359648289 podStartE2EDuration="6.965253885s" podCreationTimestamp="2026-01-25 05:49:30 +0000 UTC" firstStartedPulling="2026-01-25 05:49:31.621934896 +0000 UTC m=+662.657812876" lastFinishedPulling="2026-01-25 
05:49:36.227540493 +0000 UTC m=+667.263418472" observedRunningTime="2026-01-25 05:49:36.962772336 +0000 UTC m=+667.998650315" watchObservedRunningTime="2026-01-25 05:49:36.965253885 +0000 UTC m=+668.001131865" Jan 25 05:49:51 crc kubenswrapper[4728]: I0125 05:49:51.591772 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6bb64544-ttk7k" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.229385 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b95c97db5-42zxg" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.835212 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-722sp"] Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.837460 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.837855 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8"] Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.838634 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.840936 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4sjjf" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.841684 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.841754 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.842260 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.850606 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8"] Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.912039 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zrgnp"] Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.912990 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zrgnp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.914785 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.915073 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.915208 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-bhvwq" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.916090 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.922486 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-fdm97"] Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.923381 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.924674 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.930175 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-fdm97"] Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980072 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a77d36cb-ae0e-41c8-98be-85563d52e02c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-9r8h8\" (UID: \"a77d36cb-ae0e-41c8-98be-85563d52e02c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980218 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-sockets\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980372 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-conf\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980482 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-metrics\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 
05:50:11.980540 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-reloader\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980612 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqtw\" (UniqueName: \"kubernetes.io/projected/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-kube-api-access-jjqtw\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980732 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brz6c\" (UniqueName: \"kubernetes.io/projected/a77d36cb-ae0e-41c8-98be-85563d52e02c-kube-api-access-brz6c\") pod \"frr-k8s-webhook-server-7df86c4f6c-9r8h8\" (UID: \"a77d36cb-ae0e-41c8-98be-85563d52e02c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980765 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-metrics-certs\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:11 crc kubenswrapper[4728]: I0125 05:50:11.980797 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-startup\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.081970 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-metrics\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082017 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ctb\" (UniqueName: \"kubernetes.io/projected/44533cf6-d8b9-4376-8aad-372d74dbeecd-kube-api-access-97ctb\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082037 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-reloader\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082055 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqtw\" (UniqueName: \"kubernetes.io/projected/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-kube-api-access-jjqtw\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082089 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdj2k\" (UniqueName: \"kubernetes.io/projected/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-kube-api-access-qdj2k\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082106 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-brz6c\" (UniqueName: \"kubernetes.io/projected/a77d36cb-ae0e-41c8-98be-85563d52e02c-kube-api-access-brz6c\") pod \"frr-k8s-webhook-server-7df86c4f6c-9r8h8\" (UID: \"a77d36cb-ae0e-41c8-98be-85563d52e02c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082122 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-metrics-certs\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082142 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-startup\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082165 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082180 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-metrics-certs\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082201 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a77d36cb-ae0e-41c8-98be-85563d52e02c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-9r8h8\" (UID: \"a77d36cb-ae0e-41c8-98be-85563d52e02c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082231 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-metrics-certs\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082251 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/44533cf6-d8b9-4376-8aad-372d74dbeecd-metallb-excludel2\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082269 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-sockets\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082284 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-cert\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082307 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-conf\") pod 
\"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082428 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-metrics\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082429 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-reloader\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082606 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-conf\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.082935 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-sockets\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.083187 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-frr-startup\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.087620 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a77d36cb-ae0e-41c8-98be-85563d52e02c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-9r8h8\" (UID: \"a77d36cb-ae0e-41c8-98be-85563d52e02c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.087630 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-metrics-certs\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.097705 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqtw\" (UniqueName: \"kubernetes.io/projected/8730e55e-04b1-4da0-acc2-3f2ca701ba6a-kube-api-access-jjqtw\") pod \"frr-k8s-722sp\" (UID: \"8730e55e-04b1-4da0-acc2-3f2ca701ba6a\") " pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.097718 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brz6c\" (UniqueName: \"kubernetes.io/projected/a77d36cb-ae0e-41c8-98be-85563d52e02c-kube-api-access-brz6c\") pod \"frr-k8s-webhook-server-7df86c4f6c-9r8h8\" (UID: \"a77d36cb-ae0e-41c8-98be-85563d52e02c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.152777 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.158353 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.183363 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-cert\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.183423 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ctb\" (UniqueName: \"kubernetes.io/projected/44533cf6-d8b9-4376-8aad-372d74dbeecd-kube-api-access-97ctb\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.183459 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdj2k\" (UniqueName: \"kubernetes.io/projected/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-kube-api-access-qdj2k\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.183486 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-metrics-certs\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.183502 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc 
kubenswrapper[4728]: I0125 05:50:12.183535 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-metrics-certs\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.183552 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/44533cf6-d8b9-4376-8aad-372d74dbeecd-metallb-excludel2\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.184834 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/44533cf6-d8b9-4376-8aad-372d74dbeecd-metallb-excludel2\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: E0125 05:50:12.185249 4728 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 25 05:50:12 crc kubenswrapper[4728]: E0125 05:50:12.185285 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist podName:44533cf6-d8b9-4376-8aad-372d74dbeecd nodeName:}" failed. No retries permitted until 2026-01-25 05:50:12.68527271 +0000 UTC m=+703.721150690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist") pod "speaker-zrgnp" (UID: "44533cf6-d8b9-4376-8aad-372d74dbeecd") : secret "metallb-memberlist" not found Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.187732 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-cert\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.190981 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-metrics-certs\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.195398 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-metrics-certs\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.196776 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdj2k\" (UniqueName: \"kubernetes.io/projected/d9fe7f50-6608-4a79-81f9-bdf8290d9d90-kube-api-access-qdj2k\") pod \"controller-6968d8fdc4-fdm97\" (UID: \"d9fe7f50-6608-4a79-81f9-bdf8290d9d90\") " pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.199309 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ctb\" (UniqueName: \"kubernetes.io/projected/44533cf6-d8b9-4376-8aad-372d74dbeecd-kube-api-access-97ctb\") 
pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.232691 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.515431 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8"] Jan 25 05:50:12 crc kubenswrapper[4728]: W0125 05:50:12.519754 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda77d36cb_ae0e_41c8_98be_85563d52e02c.slice/crio-64ed7583a170b8a9f97a3d91e79a228da906ab6d5f1b46f82a7383d836e040ec WatchSource:0}: Error finding container 64ed7583a170b8a9f97a3d91e79a228da906ab6d5f1b46f82a7383d836e040ec: Status 404 returned error can't find the container with id 64ed7583a170b8a9f97a3d91e79a228da906ab6d5f1b46f82a7383d836e040ec Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.585063 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-fdm97"] Jan 25 05:50:12 crc kubenswrapper[4728]: W0125 05:50:12.587098 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9fe7f50_6608_4a79_81f9_bdf8290d9d90.slice/crio-b4d00cf73d7dd41fbb6796bb974511b881b1f0783167e94e7587f8f9c204cdb7 WatchSource:0}: Error finding container b4d00cf73d7dd41fbb6796bb974511b881b1f0783167e94e7587f8f9c204cdb7: Status 404 returned error can't find the container with id b4d00cf73d7dd41fbb6796bb974511b881b1f0783167e94e7587f8f9c204cdb7 Jan 25 05:50:12 crc kubenswrapper[4728]: I0125 05:50:12.691370 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist\") pod \"speaker-zrgnp\" (UID: 
\"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:12 crc kubenswrapper[4728]: E0125 05:50:12.691570 4728 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 25 05:50:12 crc kubenswrapper[4728]: E0125 05:50:12.691671 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist podName:44533cf6-d8b9-4376-8aad-372d74dbeecd nodeName:}" failed. No retries permitted until 2026-01-25 05:50:13.691643608 +0000 UTC m=+704.727521588 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist") pod "speaker-zrgnp" (UID: "44533cf6-d8b9-4376-8aad-372d74dbeecd") : secret "metallb-memberlist" not found Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.139334 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-fdm97" event={"ID":"d9fe7f50-6608-4a79-81f9-bdf8290d9d90","Type":"ContainerStarted","Data":"8131a16568026b47f0a0e625e956d4014c7988bd42b19eadebb8ba39c4b3905a"} Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.139737 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-fdm97" event={"ID":"d9fe7f50-6608-4a79-81f9-bdf8290d9d90","Type":"ContainerStarted","Data":"546d7c9dcdb13db2f88cffa15a39ec7becd2c450476068365676d3298da0d22a"} Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.139767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-fdm97" event={"ID":"d9fe7f50-6608-4a79-81f9-bdf8290d9d90","Type":"ContainerStarted","Data":"b4d00cf73d7dd41fbb6796bb974511b881b1f0783167e94e7587f8f9c204cdb7"} Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.139789 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.140936 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"00a1486f85c9406e35d3cb542861933948a4b055d1ac92d36a5a98fe211cbb87"} Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.141912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" event={"ID":"a77d36cb-ae0e-41c8-98be-85563d52e02c","Type":"ContainerStarted","Data":"64ed7583a170b8a9f97a3d91e79a228da906ab6d5f1b46f82a7383d836e040ec"} Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.704311 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.709547 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44533cf6-d8b9-4376-8aad-372d74dbeecd-memberlist\") pod \"speaker-zrgnp\" (UID: \"44533cf6-d8b9-4376-8aad-372d74dbeecd\") " pod="metallb-system/speaker-zrgnp" Jan 25 05:50:13 crc kubenswrapper[4728]: I0125 05:50:13.723922 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zrgnp" Jan 25 05:50:13 crc kubenswrapper[4728]: W0125 05:50:13.741824 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44533cf6_d8b9_4376_8aad_372d74dbeecd.slice/crio-3defa34eae857bca9d623e5e7939cb14d860d93f97f5c4b8e6acdf8395b84a29 WatchSource:0}: Error finding container 3defa34eae857bca9d623e5e7939cb14d860d93f97f5c4b8e6acdf8395b84a29: Status 404 returned error can't find the container with id 3defa34eae857bca9d623e5e7939cb14d860d93f97f5c4b8e6acdf8395b84a29 Jan 25 05:50:14 crc kubenswrapper[4728]: I0125 05:50:14.155973 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zrgnp" event={"ID":"44533cf6-d8b9-4376-8aad-372d74dbeecd","Type":"ContainerStarted","Data":"dd8f65bf5dac80d614140b29cae82bc5e82cca08538377c9e035ae2913a751ec"} Jan 25 05:50:14 crc kubenswrapper[4728]: I0125 05:50:14.156622 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zrgnp" event={"ID":"44533cf6-d8b9-4376-8aad-372d74dbeecd","Type":"ContainerStarted","Data":"3defa34eae857bca9d623e5e7939cb14d860d93f97f5c4b8e6acdf8395b84a29"} Jan 25 05:50:15 crc kubenswrapper[4728]: I0125 05:50:15.165158 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zrgnp" event={"ID":"44533cf6-d8b9-4376-8aad-372d74dbeecd","Type":"ContainerStarted","Data":"4277a9a4ba1fca23b14d242532e6452d082ad2a3f2a509dbe900bc6c386d598b"} Jan 25 05:50:15 crc kubenswrapper[4728]: I0125 05:50:15.165362 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zrgnp" Jan 25 05:50:15 crc kubenswrapper[4728]: I0125 05:50:15.177577 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-fdm97" podStartSLOduration=4.177567256 podStartE2EDuration="4.177567256s" podCreationTimestamp="2026-01-25 05:50:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:50:13.155705917 +0000 UTC m=+704.191583888" watchObservedRunningTime="2026-01-25 05:50:15.177567256 +0000 UTC m=+706.213445235" Jan 25 05:50:15 crc kubenswrapper[4728]: I0125 05:50:15.178244 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zrgnp" podStartSLOduration=4.178227351 podStartE2EDuration="4.178227351s" podCreationTimestamp="2026-01-25 05:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:50:15.177296385 +0000 UTC m=+706.213174364" watchObservedRunningTime="2026-01-25 05:50:15.178227351 +0000 UTC m=+706.214105331" Jan 25 05:50:19 crc kubenswrapper[4728]: I0125 05:50:19.192817 4728 generic.go:334] "Generic (PLEG): container finished" podID="8730e55e-04b1-4da0-acc2-3f2ca701ba6a" containerID="b7d64dc872208d4b7ea1e607ef5d39fe2a1e6fda13199793459dd5cc85f2fad7" exitCode=0 Jan 25 05:50:19 crc kubenswrapper[4728]: I0125 05:50:19.193535 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerDied","Data":"b7d64dc872208d4b7ea1e607ef5d39fe2a1e6fda13199793459dd5cc85f2fad7"} Jan 25 05:50:19 crc kubenswrapper[4728]: I0125 05:50:19.195241 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" event={"ID":"a77d36cb-ae0e-41c8-98be-85563d52e02c","Type":"ContainerStarted","Data":"5f71a5e1d8548523a05fca0d4019e3088745232445d736967913e08d4fc3267d"} Jan 25 05:50:19 crc kubenswrapper[4728]: I0125 05:50:19.195731 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:19 crc kubenswrapper[4728]: I0125 05:50:19.234839 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" podStartSLOduration=2.358875126 podStartE2EDuration="8.234825693s" podCreationTimestamp="2026-01-25 05:50:11 +0000 UTC" firstStartedPulling="2026-01-25 05:50:12.521474701 +0000 UTC m=+703.557352682" lastFinishedPulling="2026-01-25 05:50:18.397425268 +0000 UTC m=+709.433303249" observedRunningTime="2026-01-25 05:50:19.231005547 +0000 UTC m=+710.266883528" watchObservedRunningTime="2026-01-25 05:50:19.234825693 +0000 UTC m=+710.270703673" Jan 25 05:50:20 crc kubenswrapper[4728]: I0125 05:50:20.207208 4728 generic.go:334] "Generic (PLEG): container finished" podID="8730e55e-04b1-4da0-acc2-3f2ca701ba6a" containerID="6e64644979134c99997ded11a06df943f13de5277aa6217cc9b43dac3a131235" exitCode=0 Jan 25 05:50:20 crc kubenswrapper[4728]: I0125 05:50:20.207267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerDied","Data":"6e64644979134c99997ded11a06df943f13de5277aa6217cc9b43dac3a131235"} Jan 25 05:50:21 crc kubenswrapper[4728]: I0125 05:50:21.214427 4728 generic.go:334] "Generic (PLEG): container finished" podID="8730e55e-04b1-4da0-acc2-3f2ca701ba6a" containerID="be68d7bdfb23f29ccd5a5c79a4734b6decb4d50c6551c097ca659d2bcab3dd72" exitCode=0 Jan 25 05:50:21 crc kubenswrapper[4728]: I0125 05:50:21.214528 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerDied","Data":"be68d7bdfb23f29ccd5a5c79a4734b6decb4d50c6551c097ca659d2bcab3dd72"} Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.224755 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"2d387eb341ba8055c98e55429c0a8ed9717ce21c7b0a27e6c6ba6c5a7dd94e4f"} Jan 25 
05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.225152 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"8d4f93fff2505546cf7e5368ef6cf2be072bda69ef64c5854316c98ca6350b9a"} Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.225164 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"e5c1dc474441b458e3b3595a70265e3941b8f23af715e69ce494cb422c8a4e52"} Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.225178 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.225189 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"6b50f2ecf9d8707366d2524d1debaba45102b6a17e151f27e43349c48f02318f"} Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.225198 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"34e27215e8320588032f5c39abd224d9c3a41e6ecd4f5e877c2324a803d4f40a"} Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.225207 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-722sp" event={"ID":"8730e55e-04b1-4da0-acc2-3f2ca701ba6a","Type":"ContainerStarted","Data":"6a22abebd0198401eefc8b66c051d67d9adae6a697a89b88fb8f7566d2ae069e"} Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.236362 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-fdm97" Jan 25 05:50:22 crc kubenswrapper[4728]: I0125 05:50:22.253855 4728 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="metallb-system/frr-k8s-722sp" podStartSLOduration=5.133701182 podStartE2EDuration="11.253837751s" podCreationTimestamp="2026-01-25 05:50:11 +0000 UTC" firstStartedPulling="2026-01-25 05:50:12.29539966 +0000 UTC m=+703.331277640" lastFinishedPulling="2026-01-25 05:50:18.415536228 +0000 UTC m=+709.451414209" observedRunningTime="2026-01-25 05:50:22.247595798 +0000 UTC m=+713.283473778" watchObservedRunningTime="2026-01-25 05:50:22.253837751 +0000 UTC m=+713.289715730" Jan 25 05:50:23 crc kubenswrapper[4728]: I0125 05:50:23.729967 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zrgnp" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.114365 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c6dnb"] Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.115567 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.118893 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n5chs" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.119143 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.119424 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.128878 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c6dnb"] Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.269271 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcpm\" (UniqueName: 
\"kubernetes.io/projected/8235b557-0829-47e3-925d-a9eae9b3ae67-kube-api-access-xdcpm\") pod \"openstack-operator-index-c6dnb\" (UID: \"8235b557-0829-47e3-925d-a9eae9b3ae67\") " pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.370458 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcpm\" (UniqueName: \"kubernetes.io/projected/8235b557-0829-47e3-925d-a9eae9b3ae67-kube-api-access-xdcpm\") pod \"openstack-operator-index-c6dnb\" (UID: \"8235b557-0829-47e3-925d-a9eae9b3ae67\") " pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.386799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcpm\" (UniqueName: \"kubernetes.io/projected/8235b557-0829-47e3-925d-a9eae9b3ae67-kube-api-access-xdcpm\") pod \"openstack-operator-index-c6dnb\" (UID: \"8235b557-0829-47e3-925d-a9eae9b3ae67\") " pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.431673 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:26 crc kubenswrapper[4728]: I0125 05:50:26.780780 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c6dnb"] Jan 25 05:50:26 crc kubenswrapper[4728]: W0125 05:50:26.785925 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8235b557_0829_47e3_925d_a9eae9b3ae67.slice/crio-5b4fa60fd9c5ad655d5759f1dfc99456dd842e01533f5dfccc3a0a85345ed389 WatchSource:0}: Error finding container 5b4fa60fd9c5ad655d5759f1dfc99456dd842e01533f5dfccc3a0a85345ed389: Status 404 returned error can't find the container with id 5b4fa60fd9c5ad655d5759f1dfc99456dd842e01533f5dfccc3a0a85345ed389 Jan 25 05:50:27 crc kubenswrapper[4728]: I0125 05:50:27.153942 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:27 crc kubenswrapper[4728]: I0125 05:50:27.187762 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:27 crc kubenswrapper[4728]: I0125 05:50:27.258501 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c6dnb" event={"ID":"8235b557-0829-47e3-925d-a9eae9b3ae67","Type":"ContainerStarted","Data":"5b4fa60fd9c5ad655d5759f1dfc99456dd842e01533f5dfccc3a0a85345ed389"} Jan 25 05:50:28 crc kubenswrapper[4728]: I0125 05:50:28.267479 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c6dnb" event={"ID":"8235b557-0829-47e3-925d-a9eae9b3ae67","Type":"ContainerStarted","Data":"3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305"} Jan 25 05:50:28 crc kubenswrapper[4728]: I0125 05:50:28.283460 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c6dnb" 
podStartSLOduration=1.005477302 podStartE2EDuration="2.283446769s" podCreationTimestamp="2026-01-25 05:50:26 +0000 UTC" firstStartedPulling="2026-01-25 05:50:26.787882074 +0000 UTC m=+717.823760055" lastFinishedPulling="2026-01-25 05:50:28.065851542 +0000 UTC m=+719.101729522" observedRunningTime="2026-01-25 05:50:28.280098172 +0000 UTC m=+719.315976153" watchObservedRunningTime="2026-01-25 05:50:28.283446769 +0000 UTC m=+719.319324738" Jan 25 05:50:29 crc kubenswrapper[4728]: I0125 05:50:29.492888 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-c6dnb"] Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.098444 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wtmq9"] Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.099709 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.107759 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wtmq9"] Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.117687 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s46kt\" (UniqueName: \"kubernetes.io/projected/f0c09b75-dc3c-4fa8-b310-d95a41ba1564-kube-api-access-s46kt\") pod \"openstack-operator-index-wtmq9\" (UID: \"f0c09b75-dc3c-4fa8-b310-d95a41ba1564\") " pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.218810 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s46kt\" (UniqueName: \"kubernetes.io/projected/f0c09b75-dc3c-4fa8-b310-d95a41ba1564-kube-api-access-s46kt\") pod \"openstack-operator-index-wtmq9\" (UID: \"f0c09b75-dc3c-4fa8-b310-d95a41ba1564\") " 
pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.236446 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s46kt\" (UniqueName: \"kubernetes.io/projected/f0c09b75-dc3c-4fa8-b310-d95a41ba1564-kube-api-access-s46kt\") pod \"openstack-operator-index-wtmq9\" (UID: \"f0c09b75-dc3c-4fa8-b310-d95a41ba1564\") " pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.281612 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-c6dnb" podUID="8235b557-0829-47e3-925d-a9eae9b3ae67" containerName="registry-server" containerID="cri-o://3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305" gracePeriod=2 Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.412390 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.614160 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.727096 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdcpm\" (UniqueName: \"kubernetes.io/projected/8235b557-0829-47e3-925d-a9eae9b3ae67-kube-api-access-xdcpm\") pod \"8235b557-0829-47e3-925d-a9eae9b3ae67\" (UID: \"8235b557-0829-47e3-925d-a9eae9b3ae67\") " Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.733403 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8235b557-0829-47e3-925d-a9eae9b3ae67-kube-api-access-xdcpm" (OuterVolumeSpecName: "kube-api-access-xdcpm") pod "8235b557-0829-47e3-925d-a9eae9b3ae67" (UID: "8235b557-0829-47e3-925d-a9eae9b3ae67"). 
InnerVolumeSpecName "kube-api-access-xdcpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.803502 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wtmq9"] Jan 25 05:50:30 crc kubenswrapper[4728]: I0125 05:50:30.829089 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdcpm\" (UniqueName: \"kubernetes.io/projected/8235b557-0829-47e3-925d-a9eae9b3ae67-kube-api-access-xdcpm\") on node \"crc\" DevicePath \"\"" Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.289672 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtmq9" event={"ID":"f0c09b75-dc3c-4fa8-b310-d95a41ba1564","Type":"ContainerStarted","Data":"7105fbdaed2a9defe2dca61fcfd33c1a393e7ab17c54dd00231a1d7a3ae16d6d"} Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.291558 4728 generic.go:334] "Generic (PLEG): container finished" podID="8235b557-0829-47e3-925d-a9eae9b3ae67" containerID="3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305" exitCode=0 Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.291597 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c6dnb" event={"ID":"8235b557-0829-47e3-925d-a9eae9b3ae67","Type":"ContainerDied","Data":"3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305"} Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.291614 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c6dnb" event={"ID":"8235b557-0829-47e3-925d-a9eae9b3ae67","Type":"ContainerDied","Data":"5b4fa60fd9c5ad655d5759f1dfc99456dd842e01533f5dfccc3a0a85345ed389"} Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.291635 4728 scope.go:117] "RemoveContainer" containerID="3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305" Jan 25 05:50:31 crc 
kubenswrapper[4728]: I0125 05:50:31.291640 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c6dnb" Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.313649 4728 scope.go:117] "RemoveContainer" containerID="3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305" Jan 25 05:50:31 crc kubenswrapper[4728]: E0125 05:50:31.314631 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305\": container with ID starting with 3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305 not found: ID does not exist" containerID="3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305" Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.314667 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305"} err="failed to get container status \"3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305\": rpc error: code = NotFound desc = could not find container \"3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305\": container with ID starting with 3ee21625eb64901f7f26d5bf3fe3349607ba412d3cca60683a13b1678dd7f305 not found: ID does not exist" Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.317627 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-c6dnb"] Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.320958 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-c6dnb"] Jan 25 05:50:31 crc kubenswrapper[4728]: I0125 05:50:31.335069 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8235b557-0829-47e3-925d-a9eae9b3ae67" 
path="/var/lib/kubelet/pods/8235b557-0829-47e3-925d-a9eae9b3ae67/volumes" Jan 25 05:50:32 crc kubenswrapper[4728]: I0125 05:50:32.156680 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-722sp" Jan 25 05:50:32 crc kubenswrapper[4728]: I0125 05:50:32.166892 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-9r8h8" Jan 25 05:50:32 crc kubenswrapper[4728]: I0125 05:50:32.307202 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtmq9" event={"ID":"f0c09b75-dc3c-4fa8-b310-d95a41ba1564","Type":"ContainerStarted","Data":"a6ba5b6600110be0f49c51062b87b3e0d728f67f70f894d72917a86327ef5456"} Jan 25 05:50:32 crc kubenswrapper[4728]: I0125 05:50:32.324518 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wtmq9" podStartSLOduration=1.833237365 podStartE2EDuration="2.324503042s" podCreationTimestamp="2026-01-25 05:50:30 +0000 UTC" firstStartedPulling="2026-01-25 05:50:30.812489023 +0000 UTC m=+721.848367002" lastFinishedPulling="2026-01-25 05:50:31.303754699 +0000 UTC m=+722.339632679" observedRunningTime="2026-01-25 05:50:32.321625645 +0000 UTC m=+723.357503624" watchObservedRunningTime="2026-01-25 05:50:32.324503042 +0000 UTC m=+723.360381023" Jan 25 05:50:40 crc kubenswrapper[4728]: I0125 05:50:40.413546 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:40 crc kubenswrapper[4728]: I0125 05:50:40.414763 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:40 crc kubenswrapper[4728]: I0125 05:50:40.439983 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:41 
crc kubenswrapper[4728]: I0125 05:50:41.388144 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wtmq9" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.132147 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr"] Jan 25 05:50:42 crc kubenswrapper[4728]: E0125 05:50:42.132742 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8235b557-0829-47e3-925d-a9eae9b3ae67" containerName="registry-server" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.132759 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8235b557-0829-47e3-925d-a9eae9b3ae67" containerName="registry-server" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.132888 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8235b557-0829-47e3-925d-a9eae9b3ae67" containerName="registry-server" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.133773 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.135873 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w9kqf" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.137775 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr"] Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.263002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-bundle\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.263153 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-util\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.263241 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9vk4\" (UniqueName: \"kubernetes.io/projected/4946a9cb-495f-442c-b77d-9ff84ce2b795-kube-api-access-h9vk4\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 
05:50:42.364353 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-bundle\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.364402 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-util\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.364434 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9vk4\" (UniqueName: \"kubernetes.io/projected/4946a9cb-495f-442c-b77d-9ff84ce2b795-kube-api-access-h9vk4\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.364941 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-bundle\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.365638 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-util\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.382929 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9vk4\" (UniqueName: \"kubernetes.io/projected/4946a9cb-495f-442c-b77d-9ff84ce2b795-kube-api-access-h9vk4\") pod \"35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.447845 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:42 crc kubenswrapper[4728]: I0125 05:50:42.820257 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr"] Jan 25 05:50:43 crc kubenswrapper[4728]: I0125 05:50:43.385278 4728 generic.go:334] "Generic (PLEG): container finished" podID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerID="a77aa35081c95e5f0811ba62d87f4a39204a2b090727dece89bf344df1073194" exitCode=0 Jan 25 05:50:43 crc kubenswrapper[4728]: I0125 05:50:43.385448 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" event={"ID":"4946a9cb-495f-442c-b77d-9ff84ce2b795","Type":"ContainerDied","Data":"a77aa35081c95e5f0811ba62d87f4a39204a2b090727dece89bf344df1073194"} Jan 25 05:50:43 crc kubenswrapper[4728]: I0125 05:50:43.385604 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" event={"ID":"4946a9cb-495f-442c-b77d-9ff84ce2b795","Type":"ContainerStarted","Data":"0985fab1f28ba0761211ec331ab0cfa62f347f6bcf9b5e3c592d14423103f8fb"} Jan 25 05:50:45 crc kubenswrapper[4728]: I0125 05:50:45.403547 4728 generic.go:334] "Generic (PLEG): container finished" podID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerID="ffb9f1d61fc9f4b15f954fa7bf30d84ccf04391bc2c1c90cd7d643265f73f460" exitCode=0 Jan 25 05:50:45 crc kubenswrapper[4728]: I0125 05:50:45.403653 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" event={"ID":"4946a9cb-495f-442c-b77d-9ff84ce2b795","Type":"ContainerDied","Data":"ffb9f1d61fc9f4b15f954fa7bf30d84ccf04391bc2c1c90cd7d643265f73f460"} Jan 25 05:50:46 crc kubenswrapper[4728]: I0125 05:50:46.411179 4728 generic.go:334] "Generic (PLEG): container finished" podID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerID="3fc945c6a5ab18c782cf265aa6c2c6505bc1283a46252d5bb0f6405535baa3fb" exitCode=0 Jan 25 05:50:46 crc kubenswrapper[4728]: I0125 05:50:46.411273 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" event={"ID":"4946a9cb-495f-442c-b77d-9ff84ce2b795","Type":"ContainerDied","Data":"3fc945c6a5ab18c782cf265aa6c2c6505bc1283a46252d5bb0f6405535baa3fb"} Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.622480 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.730156 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-bundle\") pod \"4946a9cb-495f-442c-b77d-9ff84ce2b795\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.730767 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-bundle" (OuterVolumeSpecName: "bundle") pod "4946a9cb-495f-442c-b77d-9ff84ce2b795" (UID: "4946a9cb-495f-442c-b77d-9ff84ce2b795"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.730867 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-util\") pod \"4946a9cb-495f-442c-b77d-9ff84ce2b795\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.731031 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9vk4\" (UniqueName: \"kubernetes.io/projected/4946a9cb-495f-442c-b77d-9ff84ce2b795-kube-api-access-h9vk4\") pod \"4946a9cb-495f-442c-b77d-9ff84ce2b795\" (UID: \"4946a9cb-495f-442c-b77d-9ff84ce2b795\") " Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.731466 4728 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.736578 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4946a9cb-495f-442c-b77d-9ff84ce2b795-kube-api-access-h9vk4" (OuterVolumeSpecName: "kube-api-access-h9vk4") pod "4946a9cb-495f-442c-b77d-9ff84ce2b795" (UID: "4946a9cb-495f-442c-b77d-9ff84ce2b795"). InnerVolumeSpecName "kube-api-access-h9vk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.740355 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-util" (OuterVolumeSpecName: "util") pod "4946a9cb-495f-442c-b77d-9ff84ce2b795" (UID: "4946a9cb-495f-442c-b77d-9ff84ce2b795"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.831789 4728 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4946a9cb-495f-442c-b77d-9ff84ce2b795-util\") on node \"crc\" DevicePath \"\"" Jan 25 05:50:47 crc kubenswrapper[4728]: I0125 05:50:47.831813 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9vk4\" (UniqueName: \"kubernetes.io/projected/4946a9cb-495f-442c-b77d-9ff84ce2b795-kube-api-access-h9vk4\") on node \"crc\" DevicePath \"\"" Jan 25 05:50:48 crc kubenswrapper[4728]: I0125 05:50:48.425214 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" event={"ID":"4946a9cb-495f-442c-b77d-9ff84ce2b795","Type":"ContainerDied","Data":"0985fab1f28ba0761211ec331ab0cfa62f347f6bcf9b5e3c592d14423103f8fb"} Jan 25 05:50:48 crc kubenswrapper[4728]: I0125 05:50:48.425245 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr" Jan 25 05:50:48 crc kubenswrapper[4728]: I0125 05:50:48.425260 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0985fab1f28ba0761211ec331ab0cfa62f347f6bcf9b5e3c592d14423103f8fb" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.508539 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-f6799c556-rtg62"] Jan 25 05:50:54 crc kubenswrapper[4728]: E0125 05:50:54.509240 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="extract" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.509253 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="extract" Jan 25 05:50:54 crc kubenswrapper[4728]: E0125 05:50:54.509263 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="util" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.509268 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="util" Jan 25 05:50:54 crc kubenswrapper[4728]: E0125 05:50:54.509283 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="pull" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.509289 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="pull" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.509406 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4946a9cb-495f-442c-b77d-9ff84ce2b795" containerName="extract" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.509758 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.511953 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-z42k5" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.523536 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-f6799c556-rtg62"] Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.612351 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hp2\" (UniqueName: \"kubernetes.io/projected/2154b4ed-e610-40ed-8f77-cff0cf57d3a7-kube-api-access-78hp2\") pod \"openstack-operator-controller-init-f6799c556-rtg62\" (UID: \"2154b4ed-e610-40ed-8f77-cff0cf57d3a7\") " pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.714272 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hp2\" (UniqueName: \"kubernetes.io/projected/2154b4ed-e610-40ed-8f77-cff0cf57d3a7-kube-api-access-78hp2\") pod \"openstack-operator-controller-init-f6799c556-rtg62\" (UID: \"2154b4ed-e610-40ed-8f77-cff0cf57d3a7\") " pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.738531 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hp2\" (UniqueName: \"kubernetes.io/projected/2154b4ed-e610-40ed-8f77-cff0cf57d3a7-kube-api-access-78hp2\") pod \"openstack-operator-controller-init-f6799c556-rtg62\" (UID: \"2154b4ed-e610-40ed-8f77-cff0cf57d3a7\") " pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.823354 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:50:54 crc kubenswrapper[4728]: I0125 05:50:54.998299 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-f6799c556-rtg62"] Jan 25 05:50:55 crc kubenswrapper[4728]: I0125 05:50:55.466627 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" event={"ID":"2154b4ed-e610-40ed-8f77-cff0cf57d3a7","Type":"ContainerStarted","Data":"d4d08b0a785ce3a6d113d07d665ff6354d5d587370eb3ab65ff75c9b76399325"} Jan 25 05:50:55 crc kubenswrapper[4728]: I0125 05:50:55.820631 4728 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 25 05:51:01 crc kubenswrapper[4728]: I0125 05:51:01.511160 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" event={"ID":"2154b4ed-e610-40ed-8f77-cff0cf57d3a7","Type":"ContainerStarted","Data":"b13d3722d8c0b4abb6f46c398ee2c8c307b791d0ec7d5a35ef7e6850944bbf76"} Jan 25 05:51:01 crc kubenswrapper[4728]: I0125 05:51:01.512031 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:51:01 crc kubenswrapper[4728]: I0125 05:51:01.539741 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" podStartSLOduration=1.669392694 podStartE2EDuration="7.539721838s" podCreationTimestamp="2026-01-25 05:50:54 +0000 UTC" firstStartedPulling="2026-01-25 05:50:55.00354564 +0000 UTC m=+746.039423619" lastFinishedPulling="2026-01-25 05:51:00.873874784 +0000 UTC m=+751.909752763" observedRunningTime="2026-01-25 05:51:01.534550153 +0000 UTC m=+752.570428133" watchObservedRunningTime="2026-01-25 
05:51:01.539721838 +0000 UTC m=+752.575599817" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.543128 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-krbvk"] Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.544953 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.556686 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-krbvk"] Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.698867 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9n5c\" (UniqueName: \"kubernetes.io/projected/17440e8a-2d96-4467-966f-d0a36694775a-kube-api-access-x9n5c\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.698950 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-catalog-content\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.699116 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-utilities\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.801019 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-utilities\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.801136 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9n5c\" (UniqueName: \"kubernetes.io/projected/17440e8a-2d96-4467-966f-d0a36694775a-kube-api-access-x9n5c\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.801163 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-catalog-content\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.801639 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-catalog-content\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.801639 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-utilities\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.822109 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9n5c\" (UniqueName: 
\"kubernetes.io/projected/17440e8a-2d96-4467-966f-d0a36694775a-kube-api-access-x9n5c\") pod \"redhat-operators-krbvk\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:07 crc kubenswrapper[4728]: I0125 05:51:07.860426 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:08 crc kubenswrapper[4728]: I0125 05:51:08.047789 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-krbvk"] Jan 25 05:51:08 crc kubenswrapper[4728]: I0125 05:51:08.551352 4728 generic.go:334] "Generic (PLEG): container finished" podID="17440e8a-2d96-4467-966f-d0a36694775a" containerID="3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d" exitCode=0 Jan 25 05:51:08 crc kubenswrapper[4728]: I0125 05:51:08.551419 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerDied","Data":"3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d"} Jan 25 05:51:08 crc kubenswrapper[4728]: I0125 05:51:08.552006 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerStarted","Data":"06e692f611698d55b3c49b4d2faba81e35b6108cdc93af77e7711c066ac0a0d8"} Jan 25 05:51:09 crc kubenswrapper[4728]: I0125 05:51:09.559309 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerStarted","Data":"46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c"} Jan 25 05:51:10 crc kubenswrapper[4728]: I0125 05:51:10.569225 4728 generic.go:334] "Generic (PLEG): container finished" podID="17440e8a-2d96-4467-966f-d0a36694775a" 
containerID="46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c" exitCode=0 Jan 25 05:51:10 crc kubenswrapper[4728]: I0125 05:51:10.569309 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerDied","Data":"46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c"} Jan 25 05:51:11 crc kubenswrapper[4728]: I0125 05:51:11.578282 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerStarted","Data":"d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84"} Jan 25 05:51:11 crc kubenswrapper[4728]: I0125 05:51:11.596718 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-krbvk" podStartSLOduration=2.088551661 podStartE2EDuration="4.596692754s" podCreationTimestamp="2026-01-25 05:51:07 +0000 UTC" firstStartedPulling="2026-01-25 05:51:08.565305596 +0000 UTC m=+759.601183576" lastFinishedPulling="2026-01-25 05:51:11.073446689 +0000 UTC m=+762.109324669" observedRunningTime="2026-01-25 05:51:11.593065392 +0000 UTC m=+762.628943372" watchObservedRunningTime="2026-01-25 05:51:11.596692754 +0000 UTC m=+762.632570734" Jan 25 05:51:14 crc kubenswrapper[4728]: I0125 05:51:14.826985 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-f6799c556-rtg62" Jan 25 05:51:17 crc kubenswrapper[4728]: I0125 05:51:17.861030 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:17 crc kubenswrapper[4728]: I0125 05:51:17.861415 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:17 crc kubenswrapper[4728]: I0125 
05:51:17.905866 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:18 crc kubenswrapper[4728]: I0125 05:51:18.655642 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:18 crc kubenswrapper[4728]: I0125 05:51:18.695923 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-krbvk"] Jan 25 05:51:20 crc kubenswrapper[4728]: I0125 05:51:20.628148 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-krbvk" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="registry-server" containerID="cri-o://d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84" gracePeriod=2 Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.492103 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.637904 4728 generic.go:334] "Generic (PLEG): container finished" podID="17440e8a-2d96-4467-966f-d0a36694775a" containerID="d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84" exitCode=0 Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.637952 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerDied","Data":"d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84"} Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.637981 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krbvk" event={"ID":"17440e8a-2d96-4467-966f-d0a36694775a","Type":"ContainerDied","Data":"06e692f611698d55b3c49b4d2faba81e35b6108cdc93af77e7711c066ac0a0d8"} Jan 25 05:51:21 crc kubenswrapper[4728]: 
I0125 05:51:21.638000 4728 scope.go:117] "RemoveContainer" containerID="d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.638127 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krbvk" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.652691 4728 scope.go:117] "RemoveContainer" containerID="46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.671907 4728 scope.go:117] "RemoveContainer" containerID="3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.681924 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9n5c\" (UniqueName: \"kubernetes.io/projected/17440e8a-2d96-4467-966f-d0a36694775a-kube-api-access-x9n5c\") pod \"17440e8a-2d96-4467-966f-d0a36694775a\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.681972 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-utilities\") pod \"17440e8a-2d96-4467-966f-d0a36694775a\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.682145 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-catalog-content\") pod \"17440e8a-2d96-4467-966f-d0a36694775a\" (UID: \"17440e8a-2d96-4467-966f-d0a36694775a\") " Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.682715 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-utilities" (OuterVolumeSpecName: 
"utilities") pod "17440e8a-2d96-4467-966f-d0a36694775a" (UID: "17440e8a-2d96-4467-966f-d0a36694775a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.690581 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17440e8a-2d96-4467-966f-d0a36694775a-kube-api-access-x9n5c" (OuterVolumeSpecName: "kube-api-access-x9n5c") pod "17440e8a-2d96-4467-966f-d0a36694775a" (UID: "17440e8a-2d96-4467-966f-d0a36694775a"). InnerVolumeSpecName "kube-api-access-x9n5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.698965 4728 scope.go:117] "RemoveContainer" containerID="d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84" Jan 25 05:51:21 crc kubenswrapper[4728]: E0125 05:51:21.699266 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84\": container with ID starting with d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84 not found: ID does not exist" containerID="d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.699298 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84"} err="failed to get container status \"d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84\": rpc error: code = NotFound desc = could not find container \"d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84\": container with ID starting with d90bbe5eb81e0ec024e77ffbc352364221ed5234c958a14a13a5ee9f5142bd84 not found: ID does not exist" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.699333 4728 scope.go:117] "RemoveContainer" 
containerID="46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c" Jan 25 05:51:21 crc kubenswrapper[4728]: E0125 05:51:21.699640 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c\": container with ID starting with 46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c not found: ID does not exist" containerID="46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.699697 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c"} err="failed to get container status \"46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c\": rpc error: code = NotFound desc = could not find container \"46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c\": container with ID starting with 46b80ca1202683f88a77d69502050db04ceb2b414177a0ec01a2f186060fcd2c not found: ID does not exist" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.699730 4728 scope.go:117] "RemoveContainer" containerID="3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d" Jan 25 05:51:21 crc kubenswrapper[4728]: E0125 05:51:21.700022 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d\": container with ID starting with 3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d not found: ID does not exist" containerID="3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.700050 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d"} err="failed to get container status \"3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d\": rpc error: code = NotFound desc = could not find container \"3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d\": container with ID starting with 3e1e9b318512743fbb53a3fa3fa748b40be91026eb8e34b30d5a104a43f2a81d not found: ID does not exist" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.783447 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9n5c\" (UniqueName: \"kubernetes.io/projected/17440e8a-2d96-4467-966f-d0a36694775a-kube-api-access-x9n5c\") on node \"crc\" DevicePath \"\"" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.783479 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.788159 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17440e8a-2d96-4467-966f-d0a36694775a" (UID: "17440e8a-2d96-4467-966f-d0a36694775a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.885129 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17440e8a-2d96-4467-966f-d0a36694775a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.959469 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-krbvk"] Jan 25 05:51:21 crc kubenswrapper[4728]: I0125 05:51:21.962079 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-krbvk"] Jan 25 05:51:23 crc kubenswrapper[4728]: I0125 05:51:23.334771 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17440e8a-2d96-4467-966f-d0a36694775a" path="/var/lib/kubelet/pods/17440e8a-2d96-4467-966f-d0a36694775a/volumes" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.164946 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6"] Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.166533 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="registry-server" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.166551 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="registry-server" Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.166564 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="extract-content" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.166571 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="extract-content" Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.166578 4728 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="extract-utilities" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.166590 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="extract-utilities" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.166756 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="17440e8a-2d96-4467-966f-d0a36694775a" containerName="registry-server" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.167346 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.168657 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.169491 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.169928 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xxp9c" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.171085 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-gpr7x" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.175953 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.179488 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.186821 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.187846 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.189180 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-r7k2c" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.196785 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.203007 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.203731 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.205414 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-glffk" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.221092 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.231778 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.232382 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.233484 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jm7w9" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.238590 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.239133 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.241333 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.241751 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-lcmlt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.250208 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.271952 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.285638 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.332299 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.332966 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5jnxx" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.337370 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.341154 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.341982 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.343673 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.344778 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.349795 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.350482 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rvtkz" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.350792 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fx695" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.358531 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.361567 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.362703 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.363681 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh8z\" (UniqueName: \"kubernetes.io/projected/c906591a-0a65-447e-a795-aa7fb38c64bb-kube-api-access-zwh8z\") pod \"barbican-operator-controller-manager-7f86f8796f-ztxz6\" (UID: \"c906591a-0a65-447e-a795-aa7fb38c64bb\") " pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.363735 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69fhv\" (UniqueName: \"kubernetes.io/projected/3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6-kube-api-access-69fhv\") pod \"horizon-operator-controller-manager-77d5c5b54f-nkvhs\" (UID: \"3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.363756 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h9ql\" (UniqueName: \"kubernetes.io/projected/dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6-kube-api-access-2h9ql\") pod \"glance-operator-controller-manager-78fdd796fd-f2dft\" (UID: \"dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.363795 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwb4k\" (UniqueName: \"kubernetes.io/projected/13aad2b9-2318-480d-990b-e0627fa9b671-kube-api-access-kwb4k\") pod \"designate-operator-controller-manager-b45d7bf98-9gflx\" (UID: \"13aad2b9-2318-480d-990b-e0627fa9b671\") " 
pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.363824 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs47p\" (UniqueName: \"kubernetes.io/projected/f171accd-8da2-4cf6-a195-536365fbeceb-kube-api-access-vs47p\") pod \"cinder-operator-controller-manager-7478f7dbf9-qgwgl\" (UID: \"f171accd-8da2-4cf6-a195-536365fbeceb\") " pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.363842 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zf4t\" (UniqueName: \"kubernetes.io/projected/788a3e7e-9822-4c83-a7b8-0673f1dcbf6d-kube-api-access-7zf4t\") pod \"heat-operator-controller-manager-594c8c9d5d-pk7js\" (UID: \"788a3e7e-9822-4c83-a7b8-0673f1dcbf6d\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.366234 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8g5sc" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.372828 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.379064 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.380127 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.382081 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-t6z5b" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.394369 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.405659 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.406505 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.408851 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-gkpcn" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.426609 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.427623 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.428843 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.429413 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.430665 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k5bjj" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.434701 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-slwrq" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.435083 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.437985 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.441434 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.449489 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.450064 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.451440 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fssw5" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.452114 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.458729 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.459414 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.461912 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-g649v" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464777 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zxtv\" (UniqueName: \"kubernetes.io/projected/5ac48f0b-9ef8-427e-b07c-2318e909b080-kube-api-access-5zxtv\") pod \"manila-operator-controller-manager-78c6999f6f-ck9lm\" (UID: \"5ac48f0b-9ef8-427e-b07c-2318e909b080\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464814 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtjr\" (UniqueName: \"kubernetes.io/projected/700f794a-9dd3-4cea-bdb4-0f17e7faa246-kube-api-access-7wtjr\") pod \"keystone-operator-controller-manager-b8b6d4659-tzts2\" (UID: 
\"700f794a-9dd3-4cea-bdb4-0f17e7faa246\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464847 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj4rz\" (UniqueName: \"kubernetes.io/projected/ad255789-2727-45f9-a389-fee59b5a141a-kube-api-access-dj4rz\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464872 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwb4k\" (UniqueName: \"kubernetes.io/projected/13aad2b9-2318-480d-990b-e0627fa9b671-kube-api-access-kwb4k\") pod \"designate-operator-controller-manager-b45d7bf98-9gflx\" (UID: \"13aad2b9-2318-480d-990b-e0627fa9b671\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464901 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs47p\" (UniqueName: \"kubernetes.io/projected/f171accd-8da2-4cf6-a195-536365fbeceb-kube-api-access-vs47p\") pod \"cinder-operator-controller-manager-7478f7dbf9-qgwgl\" (UID: \"f171accd-8da2-4cf6-a195-536365fbeceb\") " pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464922 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zf4t\" (UniqueName: \"kubernetes.io/projected/788a3e7e-9822-4c83-a7b8-0673f1dcbf6d-kube-api-access-7zf4t\") pod \"heat-operator-controller-manager-594c8c9d5d-pk7js\" (UID: \"788a3e7e-9822-4c83-a7b8-0673f1dcbf6d\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" Jan 25 05:51:34 crc 
kubenswrapper[4728]: I0125 05:51:34.464943 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djwgt\" (UniqueName: \"kubernetes.io/projected/694de73b-9b23-4f4c-a54d-bdd806df4e20-kube-api-access-djwgt\") pod \"ironic-operator-controller-manager-598f7747c9-p2f2m\" (UID: \"694de73b-9b23-4f4c-a54d-bdd806df4e20\") " pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.464965 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh8z\" (UniqueName: \"kubernetes.io/projected/c906591a-0a65-447e-a795-aa7fb38c64bb-kube-api-access-zwh8z\") pod \"barbican-operator-controller-manager-7f86f8796f-ztxz6\" (UID: \"c906591a-0a65-447e-a795-aa7fb38c64bb\") " pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.465006 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69fhv\" (UniqueName: \"kubernetes.io/projected/3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6-kube-api-access-69fhv\") pod \"horizon-operator-controller-manager-77d5c5b54f-nkvhs\" (UID: \"3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.465022 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h9ql\" (UniqueName: \"kubernetes.io/projected/dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6-kube-api-access-2h9ql\") pod \"glance-operator-controller-manager-78fdd796fd-f2dft\" (UID: \"dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.465043 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.468969 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.469916 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.473987 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6cxl5" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.474109 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.483211 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.485859 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zf4t\" (UniqueName: \"kubernetes.io/projected/788a3e7e-9822-4c83-a7b8-0673f1dcbf6d-kube-api-access-7zf4t\") pod \"heat-operator-controller-manager-594c8c9d5d-pk7js\" (UID: \"788a3e7e-9822-4c83-a7b8-0673f1dcbf6d\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.486146 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs47p\" (UniqueName: 
\"kubernetes.io/projected/f171accd-8da2-4cf6-a195-536365fbeceb-kube-api-access-vs47p\") pod \"cinder-operator-controller-manager-7478f7dbf9-qgwgl\" (UID: \"f171accd-8da2-4cf6-a195-536365fbeceb\") " pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.486300 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69fhv\" (UniqueName: \"kubernetes.io/projected/3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6-kube-api-access-69fhv\") pod \"horizon-operator-controller-manager-77d5c5b54f-nkvhs\" (UID: \"3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.488885 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h9ql\" (UniqueName: \"kubernetes.io/projected/dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6-kube-api-access-2h9ql\") pod \"glance-operator-controller-manager-78fdd796fd-f2dft\" (UID: \"dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.490087 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.492913 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwb4k\" (UniqueName: \"kubernetes.io/projected/13aad2b9-2318-480d-990b-e0627fa9b671-kube-api-access-kwb4k\") pod \"designate-operator-controller-manager-b45d7bf98-9gflx\" (UID: \"13aad2b9-2318-480d-990b-e0627fa9b671\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.495507 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.499835 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh8z\" (UniqueName: \"kubernetes.io/projected/c906591a-0a65-447e-a795-aa7fb38c64bb-kube-api-access-zwh8z\") pod \"barbican-operator-controller-manager-7f86f8796f-ztxz6\" (UID: \"c906591a-0a65-447e-a795-aa7fb38c64bb\") " pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.502377 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.506169 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.507139 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.508870 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lsvkk" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.514134 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.514358 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.546129 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.554182 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.573731 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj4rz\" (UniqueName: \"kubernetes.io/projected/ad255789-2727-45f9-a389-fee59b5a141a-kube-api-access-dj4rz\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.573835 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fpm2\" (UniqueName: \"kubernetes.io/projected/e0975e48-db18-44dc-99d7-524b381ad58d-kube-api-access-5fpm2\") pod \"ovn-operator-controller-manager-6f75f45d54-h4hvm\" (UID: \"e0975e48-db18-44dc-99d7-524b381ad58d\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583112 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2llt\" (UniqueName: \"kubernetes.io/projected/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-kube-api-access-l2llt\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583170 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccb72\" (UniqueName: 
\"kubernetes.io/projected/9db3c1e7-c92f-40d3-8ff9-ef86e1376688-kube-api-access-ccb72\") pod \"neutron-operator-controller-manager-78d58447c5-b65wc\" (UID: \"9db3c1e7-c92f-40d3-8ff9-ef86e1376688\") " pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djwgt\" (UniqueName: \"kubernetes.io/projected/694de73b-9b23-4f4c-a54d-bdd806df4e20-kube-api-access-djwgt\") pod \"ironic-operator-controller-manager-598f7747c9-p2f2m\" (UID: \"694de73b-9b23-4f4c-a54d-bdd806df4e20\") " pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583248 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6pjp\" (UniqueName: \"kubernetes.io/projected/8e366d4f-b864-47e2-a289-19f97f76a38a-kube-api-access-t6pjp\") pod \"placement-operator-controller-manager-79d5ccc684-mknzf\" (UID: \"8e366d4f-b864-47e2-a289-19f97f76a38a\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583294 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c7g2\" (UniqueName: \"kubernetes.io/projected/733ecefc-22d8-4a52-9540-09b4aac018e1-kube-api-access-2c7g2\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-62h6z\" (UID: \"733ecefc-22d8-4a52-9540-09b4aac018e1\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583359 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod 
\"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583411 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583468 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zxtv\" (UniqueName: \"kubernetes.io/projected/5ac48f0b-9ef8-427e-b07c-2318e909b080-kube-api-access-5zxtv\") pod \"manila-operator-controller-manager-78c6999f6f-ck9lm\" (UID: \"5ac48f0b-9ef8-427e-b07c-2318e909b080\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583512 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8gsv\" (UniqueName: \"kubernetes.io/projected/663cb342-06a8-4ee4-8e1b-6b2658e1781f-kube-api-access-c8gsv\") pod \"octavia-operator-controller-manager-5f4cd88d46-982pl\" (UID: \"663cb342-06a8-4ee4-8e1b-6b2658e1781f\") " pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583532 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjg9n\" (UniqueName: \"kubernetes.io/projected/5f8b10f8-34e5-4250-ade1-7d47b008a4d6-kube-api-access-vjg9n\") pod \"nova-operator-controller-manager-7bdb645866-8kptt\" (UID: \"5f8b10f8-34e5-4250-ade1-7d47b008a4d6\") " 
pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.583555 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtjr\" (UniqueName: \"kubernetes.io/projected/700f794a-9dd3-4cea-bdb4-0f17e7faa246-kube-api-access-7wtjr\") pod \"keystone-operator-controller-manager-b8b6d4659-tzts2\" (UID: \"700f794a-9dd3-4cea-bdb4-0f17e7faa246\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.583647 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.583708 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert podName:ad255789-2727-45f9-a389-fee59b5a141a nodeName:}" failed. No retries permitted until 2026-01-25 05:51:35.083692018 +0000 UTC m=+786.119569999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert") pod "infra-operator-controller-manager-694cf4f878-5dxmw" (UID: "ad255789-2727-45f9-a389-fee59b5a141a") : secret "infra-operator-webhook-server-cert" not found Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.595855 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.602457 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.606780 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-r8l9l" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.607925 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj4rz\" (UniqueName: \"kubernetes.io/projected/ad255789-2727-45f9-a389-fee59b5a141a-kube-api-access-dj4rz\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.613093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djwgt\" (UniqueName: \"kubernetes.io/projected/694de73b-9b23-4f4c-a54d-bdd806df4e20-kube-api-access-djwgt\") pod \"ironic-operator-controller-manager-598f7747c9-p2f2m\" (UID: \"694de73b-9b23-4f4c-a54d-bdd806df4e20\") " pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.614025 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.615538 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zxtv\" (UniqueName: \"kubernetes.io/projected/5ac48f0b-9ef8-427e-b07c-2318e909b080-kube-api-access-5zxtv\") pod \"manila-operator-controller-manager-78c6999f6f-ck9lm\" (UID: \"5ac48f0b-9ef8-427e-b07c-2318e909b080\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.620620 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7wtjr\" (UniqueName: \"kubernetes.io/projected/700f794a-9dd3-4cea-bdb4-0f17e7faa246-kube-api-access-7wtjr\") pod \"keystone-operator-controller-manager-b8b6d4659-tzts2\" (UID: \"700f794a-9dd3-4cea-bdb4-0f17e7faa246\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.664237 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.669592 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.671382 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.675522 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.675595 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gg6vt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.679929 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684785 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684826 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gsv\" (UniqueName: \"kubernetes.io/projected/663cb342-06a8-4ee4-8e1b-6b2658e1781f-kube-api-access-c8gsv\") pod \"octavia-operator-controller-manager-5f4cd88d46-982pl\" (UID: \"663cb342-06a8-4ee4-8e1b-6b2658e1781f\") " pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684848 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjg9n\" (UniqueName: \"kubernetes.io/projected/5f8b10f8-34e5-4250-ade1-7d47b008a4d6-kube-api-access-vjg9n\") pod \"nova-operator-controller-manager-7bdb645866-8kptt\" (UID: \"5f8b10f8-34e5-4250-ade1-7d47b008a4d6\") " pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684888 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7j8k\" (UniqueName: \"kubernetes.io/projected/4e555f00-c133-4b06-b5df-005238b0541d-kube-api-access-x7j8k\") pod \"test-operator-controller-manager-69797bbcbd-jhgs5\" (UID: \"4e555f00-c133-4b06-b5df-005238b0541d\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 
05:51:34.684913 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fpm2\" (UniqueName: \"kubernetes.io/projected/e0975e48-db18-44dc-99d7-524b381ad58d-kube-api-access-5fpm2\") pod \"ovn-operator-controller-manager-6f75f45d54-h4hvm\" (UID: \"e0975e48-db18-44dc-99d7-524b381ad58d\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684930 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2llt\" (UniqueName: \"kubernetes.io/projected/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-kube-api-access-l2llt\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684949 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc54c\" (UniqueName: \"kubernetes.io/projected/94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4-kube-api-access-gc54c\") pod \"telemetry-operator-controller-manager-85cd9769bb-r6mh4\" (UID: \"94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.684977 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccb72\" (UniqueName: \"kubernetes.io/projected/9db3c1e7-c92f-40d3-8ff9-ef86e1376688-kube-api-access-ccb72\") pod \"neutron-operator-controller-manager-78d58447c5-b65wc\" (UID: \"9db3c1e7-c92f-40d3-8ff9-ef86e1376688\") " pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.685008 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qv5h7\" (UniqueName: \"kubernetes.io/projected/e13158ce-126d-4980-9fbd-e7ed492ee879-kube-api-access-qv5h7\") pod \"swift-operator-controller-manager-547cbdb99f-8hkv9\" (UID: \"e13158ce-126d-4980-9fbd-e7ed492ee879\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.685025 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6pjp\" (UniqueName: \"kubernetes.io/projected/8e366d4f-b864-47e2-a289-19f97f76a38a-kube-api-access-t6pjp\") pod \"placement-operator-controller-manager-79d5ccc684-mknzf\" (UID: \"8e366d4f-b864-47e2-a289-19f97f76a38a\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.685052 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c7g2\" (UniqueName: \"kubernetes.io/projected/733ecefc-22d8-4a52-9540-09b4aac018e1-kube-api-access-2c7g2\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-62h6z\" (UID: \"733ecefc-22d8-4a52-9540-09b4aac018e1\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.685553 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 25 05:51:34 crc kubenswrapper[4728]: E0125 05:51:34.685596 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert podName:3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b nodeName:}" failed. No retries permitted until 2026-01-25 05:51:35.185581972 +0000 UTC m=+786.221459953 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert") pod "openstack-baremetal-operator-controller-manager-848957f4b4kndnb" (UID: "3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.689164 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.706035 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8gsv\" (UniqueName: \"kubernetes.io/projected/663cb342-06a8-4ee4-8e1b-6b2658e1781f-kube-api-access-c8gsv\") pod \"octavia-operator-controller-manager-5f4cd88d46-982pl\" (UID: \"663cb342-06a8-4ee4-8e1b-6b2658e1781f\") " pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.707953 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c7g2\" (UniqueName: \"kubernetes.io/projected/733ecefc-22d8-4a52-9540-09b4aac018e1-kube-api-access-2c7g2\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-62h6z\" (UID: \"733ecefc-22d8-4a52-9540-09b4aac018e1\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.708733 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2llt\" (UniqueName: \"kubernetes.io/projected/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-kube-api-access-l2llt\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.711502 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccb72\" (UniqueName: \"kubernetes.io/projected/9db3c1e7-c92f-40d3-8ff9-ef86e1376688-kube-api-access-ccb72\") pod \"neutron-operator-controller-manager-78d58447c5-b65wc\" (UID: \"9db3c1e7-c92f-40d3-8ff9-ef86e1376688\") " pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.712049 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fpm2\" (UniqueName: \"kubernetes.io/projected/e0975e48-db18-44dc-99d7-524b381ad58d-kube-api-access-5fpm2\") pod \"ovn-operator-controller-manager-6f75f45d54-h4hvm\" (UID: \"e0975e48-db18-44dc-99d7-524b381ad58d\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.714697 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjg9n\" (UniqueName: \"kubernetes.io/projected/5f8b10f8-34e5-4250-ade1-7d47b008a4d6-kube-api-access-vjg9n\") pod \"nova-operator-controller-manager-7bdb645866-8kptt\" (UID: \"5f8b10f8-34e5-4250-ade1-7d47b008a4d6\") " pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.715645 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6pjp\" (UniqueName: \"kubernetes.io/projected/8e366d4f-b864-47e2-a289-19f97f76a38a-kube-api-access-t6pjp\") pod \"placement-operator-controller-manager-79d5ccc684-mknzf\" (UID: \"8e366d4f-b864-47e2-a289-19f97f76a38a\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.735865 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.745715 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.755379 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.778559 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.781491 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-5lhdq"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.782299 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.785735 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ctldd" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.786072 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.786354 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-5lhdq"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.790025 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5h7\" (UniqueName: \"kubernetes.io/projected/e13158ce-126d-4980-9fbd-e7ed492ee879-kube-api-access-qv5h7\") pod \"swift-operator-controller-manager-547cbdb99f-8hkv9\" (UID: \"e13158ce-126d-4980-9fbd-e7ed492ee879\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.790365 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7j8k\" (UniqueName: \"kubernetes.io/projected/4e555f00-c133-4b06-b5df-005238b0541d-kube-api-access-x7j8k\") pod \"test-operator-controller-manager-69797bbcbd-jhgs5\" (UID: \"4e555f00-c133-4b06-b5df-005238b0541d\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.790422 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc54c\" (UniqueName: \"kubernetes.io/projected/94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4-kube-api-access-gc54c\") pod \"telemetry-operator-controller-manager-85cd9769bb-r6mh4\" (UID: \"94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.810465 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc54c\" (UniqueName: \"kubernetes.io/projected/94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4-kube-api-access-gc54c\") pod 
\"telemetry-operator-controller-manager-85cd9769bb-r6mh4\" (UID: \"94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.813699 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5h7\" (UniqueName: \"kubernetes.io/projected/e13158ce-126d-4980-9fbd-e7ed492ee879-kube-api-access-qv5h7\") pod \"swift-operator-controller-manager-547cbdb99f-8hkv9\" (UID: \"e13158ce-126d-4980-9fbd-e7ed492ee879\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.816534 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7j8k\" (UniqueName: \"kubernetes.io/projected/4e555f00-c133-4b06-b5df-005238b0541d-kube-api-access-x7j8k\") pod \"test-operator-controller-manager-69797bbcbd-jhgs5\" (UID: \"4e555f00-c133-4b06-b5df-005238b0541d\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.882198 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.883954 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.889740 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4s47n" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.889968 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.891354 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f89p\" (UniqueName: \"kubernetes.io/projected/76e4e202-a355-4666-8e84-96486d73174c-kube-api-access-6f89p\") pod \"watcher-operator-controller-manager-564965969-5lhdq\" (UID: \"76e4e202-a355-4666-8e84-96486d73174c\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.894769 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.904635 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.913206 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.922428 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.933848 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.968069 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.971672 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.976558 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk"] Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.977613 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-znlnq" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.992133 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.992986 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgk4s\" (UniqueName: \"kubernetes.io/projected/585498aa-6031-43a2-ab1a-f52d1bef52e7-kube-api-access-dgk4s\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.993019 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bv8x\" (UniqueName: \"kubernetes.io/projected/cf80aae2-133f-475d-900a-13e8f1dec9ea-kube-api-access-8bv8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dsqrk\" (UID: \"cf80aae2-133f-475d-900a-13e8f1dec9ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.993067 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.993094 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f89p\" (UniqueName: \"kubernetes.io/projected/76e4e202-a355-4666-8e84-96486d73174c-kube-api-access-6f89p\") pod \"watcher-operator-controller-manager-564965969-5lhdq\" (UID: \"76e4e202-a355-4666-8e84-96486d73174c\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:51:34 crc kubenswrapper[4728]: I0125 05:51:34.999025 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.001399 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.007844 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f89p\" (UniqueName: \"kubernetes.io/projected/76e4e202-a355-4666-8e84-96486d73174c-kube-api-access-6f89p\") pod \"watcher-operator-controller-manager-564965969-5lhdq\" (UID: \"76e4e202-a355-4666-8e84-96486d73174c\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.019405 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.025001 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.036342 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.093861 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bv8x\" (UniqueName: \"kubernetes.io/projected/cf80aae2-133f-475d-900a-13e8f1dec9ea-kube-api-access-8bv8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dsqrk\" (UID: \"cf80aae2-133f-475d-900a-13e8f1dec9ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.093930 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " 
pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.093973 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.094012 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.094040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgk4s\" (UniqueName: \"kubernetes.io/projected/585498aa-6031-43a2-ab1a-f52d1bef52e7-kube-api-access-dgk4s\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.094147 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.094226 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:35.59420688 +0000 UTC m=+786.630084860 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.094480 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.094602 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert podName:ad255789-2727-45f9-a389-fee59b5a141a nodeName:}" failed. No retries permitted until 2026-01-25 05:51:36.094572991 +0000 UTC m=+787.130450971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert") pod "infra-operator-controller-manager-694cf4f878-5dxmw" (UID: "ad255789-2727-45f9-a389-fee59b5a141a") : secret "infra-operator-webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.095289 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.095430 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:35.595401243 +0000 UTC m=+786.631279223 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "metrics-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.112049 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bv8x\" (UniqueName: \"kubernetes.io/projected/cf80aae2-133f-475d-900a-13e8f1dec9ea-kube-api-access-8bv8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dsqrk\" (UID: \"cf80aae2-133f-475d-900a-13e8f1dec9ea\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.113220 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgk4s\" (UniqueName: \"kubernetes.io/projected/585498aa-6031-43a2-ab1a-f52d1bef52e7-kube-api-access-dgk4s\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.143867 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.146485 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.150303 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.194939 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.195119 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.195180 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert podName:3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b nodeName:}" failed. No retries permitted until 2026-01-25 05:51:36.195164105 +0000 UTC m=+787.231042085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert") pod "openstack-baremetal-operator-controller-manager-848957f4b4kndnb" (UID: "3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.326982 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.337656 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m"] Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.349990 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod694de73b_9b23_4f4c_a54d_bdd806df4e20.slice/crio-277e8b40b944be7197065eb30dafcdaab7f623f0b0aeaef422984b38490a808c WatchSource:0}: Error finding container 277e8b40b944be7197065eb30dafcdaab7f623f0b0aeaef422984b38490a808c: Status 404 returned error can't find the container with id 277e8b40b944be7197065eb30dafcdaab7f623f0b0aeaef422984b38490a808c Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.510783 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.518181 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.523749 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.530612 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm"] Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.531057 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663cb342_06a8_4ee4_8e1b_6b2658e1781f.slice/crio-f1fa82015ca079648a5843ff1b3c91a2b7a8b207dfb824c3b40226b6fac25a01 WatchSource:0}: Error finding container 
f1fa82015ca079648a5843ff1b3c91a2b7a8b207dfb824c3b40226b6fac25a01: Status 404 returned error can't find the container with id f1fa82015ca079648a5843ff1b3c91a2b7a8b207dfb824c3b40226b6fac25a01 Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.533232 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0975e48_db18_44dc_99d7_524b381ad58d.slice/crio-9e953a9bba21fd59bf5b22191a0582b02672d8124490784649ea9e69794dadd6 WatchSource:0}: Error finding container 9e953a9bba21fd59bf5b22191a0582b02672d8124490784649ea9e69794dadd6: Status 404 returned error can't find the container with id 9e953a9bba21fd59bf5b22191a0582b02672d8124490784649ea9e69794dadd6 Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.534276 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.539499 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.603139 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.603214 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " 
pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.603361 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.603436 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:36.603416249 +0000 UTC m=+787.639294229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "webhook-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.603440 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.603509 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:36.603489668 +0000 UTC m=+787.639367638 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "metrics-server-cert" not found Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.701608 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.729670 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9"] Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.736396 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qv5h7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-547cbdb99f-8hkv9_openstack-operators(e13158ce-126d-4980-9fbd-e7ed492ee879): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.740687 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" podUID="e13158ce-126d-4980-9fbd-e7ed492ee879" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.742056 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:8abfbec47f0119a6c22c61a0ff80a4b1c6c14439a327bc75d4c529c5d8f59658,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjg9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7bdb645866-8kptt_openstack-operators(5f8b10f8-34e5-4250-ade1-7d47b008a4d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.743784 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" podUID="5f8b10f8-34e5-4250-ade1-7d47b008a4d6" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.751228 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt"] Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.757951 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b673f00227298dcfa89abb46f8296a0825add42da41e8a4bf4dd13367c738d84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2c7g2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6b9fb5fdcb-62h6z_openstack-operators(733ecefc-22d8-4a52-9540-09b4aac018e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.758245 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e555f00_c133_4b06_b5df_005238b0541d.slice/crio-4435eb37aca14fd85e619822822d12752b77cb9a789346b47c70e2b40dd1be53 WatchSource:0}: Error finding container 4435eb37aca14fd85e619822822d12752b77cb9a789346b47c70e2b40dd1be53: Status 404 returned error can't find the container with id 4435eb37aca14fd85e619822822d12752b77cb9a789346b47c70e2b40dd1be53 Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.758249 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" event={"ID":"13aad2b9-2318-480d-990b-e0627fa9b671","Type":"ContainerStarted","Data":"b0be60afd53e2f0a42940b7502a59a5727a0ad3c1ffe9011bad2f59637eac145"} Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.759138 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" podUID="733ecefc-22d8-4a52-9540-09b4aac018e1" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.759673 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" event={"ID":"9db3c1e7-c92f-40d3-8ff9-ef86e1376688","Type":"ContainerStarted","Data":"9d4ae8a048b82d54fa5918d067be992dfe4c9472807713819a32d6cfde1c1cf0"} Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.760350 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a07441_42a8_4cc4_bbe9_f6cc6b5e8ac4.slice/crio-0f5f26c53c670dc836a8d5f979e0c8dcffb3ae6c669de80cc9d4c32f20f94370 WatchSource:0}: Error finding container 0f5f26c53c670dc836a8d5f979e0c8dcffb3ae6c669de80cc9d4c32f20f94370: Status 404 returned error can't find the container with id 0f5f26c53c670dc836a8d5f979e0c8dcffb3ae6c669de80cc9d4c32f20f94370 Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.760488 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x7j8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-69797bbcbd-jhgs5_openstack-operators(4e555f00-c133-4b06-b5df-005238b0541d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.761461 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" event={"ID":"e13158ce-126d-4980-9fbd-e7ed492ee879","Type":"ContainerStarted","Data":"7e5544c12ea031e0cdcd2d8447397c11a36c6f0ffa5c5ee38d2344b9c4dc234a"} Jan 25 05:51:35 
crc kubenswrapper[4728]: I0125 05:51:35.762361 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4"] Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.762509 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" podUID="4e555f00-c133-4b06-b5df-005238b0541d" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.762594 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" podUID="e13158ce-126d-4980-9fbd-e7ed492ee879" Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.762780 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gc54c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-85cd9769bb-r6mh4_openstack-operators(94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.762877 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76e4e202_a355_4666_8e84_96486d73174c.slice/crio-8933396c4655842ded45557127a6bd58f55fb15edf9bbea5f8684e7c16bac1ac WatchSource:0}: Error finding container 
8933396c4655842ded45557127a6bd58f55fb15edf9bbea5f8684e7c16bac1ac: Status 404 returned error can't find the container with id 8933396c4655842ded45557127a6bd58f55fb15edf9bbea5f8684e7c16bac1ac Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.763917 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" podUID="94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.766595 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" event={"ID":"663cb342-06a8-4ee4-8e1b-6b2658e1781f","Type":"ContainerStarted","Data":"f1fa82015ca079648a5843ff1b3c91a2b7a8b207dfb824c3b40226b6fac25a01"} Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.767910 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6f89p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-5lhdq_openstack-operators(76e4e202-a355-4666-8e84-96486d73174c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.768818 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" event={"ID":"3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6","Type":"ContainerStarted","Data":"5aaedc67408079a5e1fb90b1b9aa7223f3cb4029d84d5eddfab5f1f3ba1de90f"} Jan 25 
05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.769004 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" podUID="76e4e202-a355-4666-8e84-96486d73174c" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.774051 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" event={"ID":"700f794a-9dd3-4cea-bdb4-0f17e7faa246","Type":"ContainerStarted","Data":"9d79048036edae6c7e3ca3c496fb8daefed7effe2f2990e25ea3a58010c75cc1"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.775012 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-5lhdq"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.776797 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" event={"ID":"f171accd-8da2-4cf6-a195-536365fbeceb","Type":"ContainerStarted","Data":"b131ed527fc39fbfb8532c09ef0430b57f220712735aefa47dda3c81b1f3a192"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.777991 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" event={"ID":"5ac48f0b-9ef8-427e-b07c-2318e909b080","Type":"ContainerStarted","Data":"abab367875df1eea3ac159f319c71cb250c4a4a8113ade91d3da8bed3e102e27"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.779600 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" event={"ID":"694de73b-9b23-4f4c-a54d-bdd806df4e20","Type":"ContainerStarted","Data":"277e8b40b944be7197065eb30dafcdaab7f623f0b0aeaef422984b38490a808c"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.780520 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" event={"ID":"788a3e7e-9822-4c83-a7b8-0673f1dcbf6d","Type":"ContainerStarted","Data":"31f5c48a22d1dad7e471e3d72eed9572fdf970bb109738cb055fcb205d510347"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.781811 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" event={"ID":"dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6","Type":"ContainerStarted","Data":"f004317a3947a6819bb25657da57ae1d7d20adf7b9350f9a040454b5332e4385"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.782879 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" event={"ID":"5f8b10f8-34e5-4250-ade1-7d47b008a4d6","Type":"ContainerStarted","Data":"479a032f9a96fa6c933c1bc7abf0fdc7fca0204bb3e32e1954d5d359c8bbb9be"} Jan 25 05:51:35 crc kubenswrapper[4728]: E0125 05:51:35.784161 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:8abfbec47f0119a6c22c61a0ff80a4b1c6c14439a327bc75d4c529c5d8f59658\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" podUID="5f8b10f8-34e5-4250-ade1-7d47b008a4d6" Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.784586 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.785382 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" event={"ID":"c906591a-0a65-447e-a795-aa7fb38c64bb","Type":"ContainerStarted","Data":"e53355129953eb81bff12ae5f89a10ff36254b4f44ff310588f02fbc0348c76c"} Jan 25 
05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.786571 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" event={"ID":"e0975e48-db18-44dc-99d7-524b381ad58d","Type":"ContainerStarted","Data":"9e953a9bba21fd59bf5b22191a0582b02672d8124490784649ea9e69794dadd6"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.787425 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" event={"ID":"8e366d4f-b864-47e2-a289-19f97f76a38a","Type":"ContainerStarted","Data":"de0dd4ece176de3313b7156608595cf9c16d84eea83dd4a1e5ca9a2f014215ba"} Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.789483 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5"] Jan 25 05:51:35 crc kubenswrapper[4728]: I0125 05:51:35.862575 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk"] Jan 25 05:51:35 crc kubenswrapper[4728]: W0125 05:51:35.865813 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf80aae2_133f_475d_900a_13e8f1dec9ea.slice/crio-ea913377374a98349df7423b09b35aa866c736a3921108a00c2556d15567cbf8 WatchSource:0}: Error finding container ea913377374a98349df7423b09b35aa866c736a3921108a00c2556d15567cbf8: Status 404 returned error can't find the container with id ea913377374a98349df7423b09b35aa866c736a3921108a00c2556d15567cbf8 Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.109920 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " 
pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.110118 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.110205 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert podName:ad255789-2727-45f9-a389-fee59b5a141a nodeName:}" failed. No retries permitted until 2026-01-25 05:51:38.110187861 +0000 UTC m=+789.146065842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert") pod "infra-operator-controller-manager-694cf4f878-5dxmw" (UID: "ad255789-2727-45f9-a389-fee59b5a141a") : secret "infra-operator-webhook-server-cert" not found Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.210570 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.210726 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.211073 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert podName:3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b nodeName:}" failed. No retries permitted until 2026-01-25 05:51:38.211059365 +0000 UTC m=+789.246937345 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert") pod "openstack-baremetal-operator-controller-manager-848957f4b4kndnb" (UID: "3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.614430 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.614539 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.614567 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.614611 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.614616 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:38.614603128 +0000 UTC m=+789.650481108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "metrics-server-cert" not found
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.614647 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:38.614638345 +0000 UTC m=+789.650516326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "webhook-server-cert" not found
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.795860 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" event={"ID":"94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4","Type":"ContainerStarted","Data":"0f5f26c53c670dc836a8d5f979e0c8dcffb3ae6c669de80cc9d4c32f20f94370"}
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.797439 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" podUID="94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4"
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.797854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" event={"ID":"cf80aae2-133f-475d-900a-13e8f1dec9ea","Type":"ContainerStarted","Data":"ea913377374a98349df7423b09b35aa866c736a3921108a00c2556d15567cbf8"}
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.799864 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" event={"ID":"733ecefc-22d8-4a52-9540-09b4aac018e1","Type":"ContainerStarted","Data":"59bc99f9752f44f986c4cd02983607aa849f0cb0a5935a033505c855a2aea8a1"}
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.801460 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b673f00227298dcfa89abb46f8296a0825add42da41e8a4bf4dd13367c738d84\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" podUID="733ecefc-22d8-4a52-9540-09b4aac018e1"
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.802670 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" event={"ID":"4e555f00-c133-4b06-b5df-005238b0541d","Type":"ContainerStarted","Data":"4435eb37aca14fd85e619822822d12752b77cb9a789346b47c70e2b40dd1be53"}
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.803889 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" podUID="4e555f00-c133-4b06-b5df-005238b0541d"
Jan 25 05:51:36 crc kubenswrapper[4728]: I0125 05:51:36.804686 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" event={"ID":"76e4e202-a355-4666-8e84-96486d73174c","Type":"ContainerStarted","Data":"8933396c4655842ded45557127a6bd58f55fb15edf9bbea5f8684e7c16bac1ac"}
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.805544 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:8abfbec47f0119a6c22c61a0ff80a4b1c6c14439a327bc75d4c529c5d8f59658\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" podUID="5f8b10f8-34e5-4250-ade1-7d47b008a4d6"
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.805643 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" podUID="76e4e202-a355-4666-8e84-96486d73174c"
Jan 25 05:51:36 crc kubenswrapper[4728]: E0125 05:51:36.805841 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" podUID="e13158ce-126d-4980-9fbd-e7ed492ee879"
Jan 25 05:51:37 crc kubenswrapper[4728]: E0125 05:51:37.814754 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" podUID="76e4e202-a355-4666-8e84-96486d73174c"
Jan 25 05:51:37 crc kubenswrapper[4728]: E0125 05:51:37.814931 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" podUID="94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4"
Jan 25 05:51:37 crc kubenswrapper[4728]: E0125 05:51:37.815246 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" podUID="4e555f00-c133-4b06-b5df-005238b0541d"
Jan 25 05:51:37 crc kubenswrapper[4728]: E0125 05:51:37.815955 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b673f00227298dcfa89abb46f8296a0825add42da41e8a4bf4dd13367c738d84\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" podUID="733ecefc-22d8-4a52-9540-09b4aac018e1"
Jan 25 05:51:38 crc kubenswrapper[4728]: I0125 05:51:38.138684 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.138849 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.138915 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert podName:ad255789-2727-45f9-a389-fee59b5a141a nodeName:}" failed. No retries permitted until 2026-01-25 05:51:42.13889806 +0000 UTC m=+793.174776041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert") pod "infra-operator-controller-manager-694cf4f878-5dxmw" (UID: "ad255789-2727-45f9-a389-fee59b5a141a") : secret "infra-operator-webhook-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: I0125 05:51:38.240025 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.240192 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.240678 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert podName:3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b nodeName:}" failed. No retries permitted until 2026-01-25 05:51:42.240662798 +0000 UTC m=+793.276540778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert") pod "openstack-baremetal-operator-controller-manager-848957f4b4kndnb" (UID: "3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: I0125 05:51:38.644236 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"
Jan 25 05:51:38 crc kubenswrapper[4728]: I0125 05:51:38.644371 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.644405 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.644483 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:42.644467133 +0000 UTC m=+793.680345113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "metrics-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.644489 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 25 05:51:38 crc kubenswrapper[4728]: E0125 05:51:38.644540 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:42.644526876 +0000 UTC m=+793.680404857 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: I0125 05:51:42.188036 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.188281 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.188410 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert podName:ad255789-2727-45f9-a389-fee59b5a141a nodeName:}" failed. No retries permitted until 2026-01-25 05:51:50.188386097 +0000 UTC m=+801.224264077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert") pod "infra-operator-controller-manager-694cf4f878-5dxmw" (UID: "ad255789-2727-45f9-a389-fee59b5a141a") : secret "infra-operator-webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: I0125 05:51:42.289467 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.289761 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.289856 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert podName:3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b nodeName:}" failed. No retries permitted until 2026-01-25 05:51:50.289830851 +0000 UTC m=+801.325708832 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert") pod "openstack-baremetal-operator-controller-manager-848957f4b4kndnb" (UID: "3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: I0125 05:51:42.694636 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"
Jan 25 05:51:42 crc kubenswrapper[4728]: I0125 05:51:42.694719 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.695210 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.695220 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.695278 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:50.695262155 +0000 UTC m=+801.731140135 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "metrics-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: E0125 05:51:42.695304 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:51:50.695286542 +0000 UTC m=+801.731164552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "webhook-server-cert" not found
Jan 25 05:51:42 crc kubenswrapper[4728]: I0125 05:51:42.899021 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 25 05:51:42 crc kubenswrapper[4728]: I0125 05:51:42.899073 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.908774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" event={"ID":"e0975e48-db18-44dc-99d7-524b381ad58d","Type":"ContainerStarted","Data":"005f02dae31fc4e7bf9c5648b09998ea36c0e94ff7d38452111fe97b7a43cdc3"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.909868 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.917128 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" event={"ID":"663cb342-06a8-4ee4-8e1b-6b2658e1781f","Type":"ContainerStarted","Data":"521e0d59f2031882aeeec8eddb72d7bcc43c3765622d3e0a393f69ea02054d1c"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.917472 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.921826 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" event={"ID":"694de73b-9b23-4f4c-a54d-bdd806df4e20","Type":"ContainerStarted","Data":"e47da141791cd8ef5389956fff64c3406e7fd5de3b7521253e8ec474d3f0941b"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.921916 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.923749 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" event={"ID":"700f794a-9dd3-4cea-bdb4-0f17e7faa246","Type":"ContainerStarted","Data":"cc7d096b1b042c6fb3031c6684275ad312234403b42be07cbc172ec7385d4e4a"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.924107 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.930822 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" podStartSLOduration=2.497319339 podStartE2EDuration="13.930813024s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.539556574 +0000 UTC m=+786.575434554" lastFinishedPulling="2026-01-25 05:51:46.973050259 +0000 UTC m=+798.008928239" observedRunningTime="2026-01-25 05:51:47.926465665 +0000 UTC m=+798.962343645" watchObservedRunningTime="2026-01-25 05:51:47.930813024 +0000 UTC m=+798.966691005"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.932417 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" event={"ID":"3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6","Type":"ContainerStarted","Data":"17ff4bd05d1a2bc355103210f19e795d4ed4610d466d08c732d05c64570a2640"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.932587 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.946742 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" event={"ID":"f171accd-8da2-4cf6-a195-536365fbeceb","Type":"ContainerStarted","Data":"ec6c763f3b4df6b6501cbd4d87213612dc42f70ff5bf2ece496f158c76d7688f"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.949009 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.951980 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" podStartSLOduration=2.474336732 podStartE2EDuration="13.951961172s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.545562702 +0000 UTC m=+786.581440682" lastFinishedPulling="2026-01-25 05:51:47.023187142 +0000 UTC m=+798.059065122" observedRunningTime="2026-01-25 05:51:47.948794209 +0000 UTC m=+798.984672189" watchObservedRunningTime="2026-01-25 05:51:47.951961172 +0000 UTC m=+798.987839151"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.953037 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" event={"ID":"8e366d4f-b864-47e2-a289-19f97f76a38a","Type":"ContainerStarted","Data":"cd640768f32c67dc56a59212557777f4134df44223c632a3c89e5b45c1fb1184"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.953438 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.954482 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" event={"ID":"13aad2b9-2318-480d-990b-e0627fa9b671","Type":"ContainerStarted","Data":"99cdbffb818a4a7696d447c8a865fb12688ffc6c1823b3328c484d00c1e243b9"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.954890 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.957071 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" event={"ID":"5ac48f0b-9ef8-427e-b07c-2318e909b080","Type":"ContainerStarted","Data":"bae1707b721e1d53a76ca2619e7bdb501c96da6dce44177dfa6b0963b4953a21"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.957591 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.958787 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" event={"ID":"788a3e7e-9822-4c83-a7b8-0673f1dcbf6d","Type":"ContainerStarted","Data":"60d662c11b48033e818873879e2b554255c58f2df4c27d659394ceacdbf84687"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.958877 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.959938 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" event={"ID":"dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6","Type":"ContainerStarted","Data":"ca5ace0c11d827998dcb90ee81eed3e4bdc8fa770555092085f5d864df5e7b59"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.960129 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.963228 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" event={"ID":"9db3c1e7-c92f-40d3-8ff9-ef86e1376688","Type":"ContainerStarted","Data":"40bd94bed02e70a70199934f9118f4617d5a120331279c9279c374c0972899ea"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.963359 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.966093 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" event={"ID":"cf80aae2-133f-475d-900a-13e8f1dec9ea","Type":"ContainerStarted","Data":"d66f24016c43863f09738448500026db97ccd2dfd8dd71b1c52385ed067fb773"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.966104 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" podStartSLOduration=2.350757378 podStartE2EDuration="13.966091152s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.357279111 +0000 UTC m=+786.393157090" lastFinishedPulling="2026-01-25 05:51:46.972612883 +0000 UTC m=+798.008490864" observedRunningTime="2026-01-25 05:51:47.965051311 +0000 UTC m=+799.000929291" watchObservedRunningTime="2026-01-25 05:51:47.966091152 +0000 UTC m=+799.001969133"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.972007 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" event={"ID":"c906591a-0a65-447e-a795-aa7fb38c64bb","Type":"ContainerStarted","Data":"1a5b0d99c2300594b48cb1c94e6e070d9c1552672422107428f95ba61c4e0094"}
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.972385 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6"
Jan 25 05:51:47 crc kubenswrapper[4728]: I0125 05:51:47.997586 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" podStartSLOduration=2.546512443 podStartE2EDuration="13.997573081s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.532750327 +0000 UTC m=+786.568628307" lastFinishedPulling="2026-01-25 05:51:46.983810964 +0000 UTC m=+798.019688945" observedRunningTime="2026-01-25 05:51:47.99272685 +0000 UTC m=+799.028604830" watchObservedRunningTime="2026-01-25 05:51:47.997573081 +0000 UTC m=+799.033451060"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.021378 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" podStartSLOduration=2.12344262 podStartE2EDuration="14.02135733s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.062245969 +0000 UTC m=+786.098123948" lastFinishedPulling="2026-01-25 05:51:46.960160677 +0000 UTC m=+797.996038658" observedRunningTime="2026-01-25 05:51:48.013618093 +0000 UTC m=+799.049496074" watchObservedRunningTime="2026-01-25 05:51:48.02135733 +0000 UTC m=+799.057235310"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.028543 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" podStartSLOduration=2.969883023 podStartE2EDuration="14.028528927s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.091058822 +0000 UTC m=+786.126936803" lastFinishedPulling="2026-01-25 05:51:46.149704727 +0000 UTC m=+797.185582707" observedRunningTime="2026-01-25 05:51:48.025794079 +0000 UTC m=+799.061672059" watchObservedRunningTime="2026-01-25 05:51:48.028528927 +0000 UTC m=+799.064406907"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.044602 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" podStartSLOduration=2.162026554 podStartE2EDuration="14.044590129s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.060716824 +0000 UTC m=+786.096594805" lastFinishedPulling="2026-01-25 05:51:46.9432804 +0000 UTC m=+797.979158380" observedRunningTime="2026-01-25 05:51:48.039005106 +0000 UTC m=+799.074883085" watchObservedRunningTime="2026-01-25 05:51:48.044590129 +0000 UTC m=+799.080468110"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.056796 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" podStartSLOduration=2.642954852 podStartE2EDuration="14.056776835s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.540162546 +0000 UTC m=+786.576040526" lastFinishedPulling="2026-01-25 05:51:46.953984529 +0000 UTC m=+797.989862509" observedRunningTime="2026-01-25 05:51:48.05348218 +0000 UTC m=+799.089360161" watchObservedRunningTime="2026-01-25 05:51:48.056776835 +0000 UTC m=+799.092654815"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.067154 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" podStartSLOduration=2.811615734 podStartE2EDuration="14.067146022s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.70139629 +0000 UTC m=+786.737274270" lastFinishedPulling="2026-01-25 05:51:46.956926578 +0000 UTC m=+797.992804558" observedRunningTime="2026-01-25 05:51:48.066426735 +0000 UTC m=+799.102304716" watchObservedRunningTime="2026-01-25 05:51:48.067146022 +0000 UTC m=+799.103024001"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.119228 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" podStartSLOduration=2.361453381 podStartE2EDuration="14.119215841s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.170384248 +0000 UTC m=+786.206262227" lastFinishedPulling="2026-01-25 05:51:46.928146706 +0000 UTC m=+797.964024687" observedRunningTime="2026-01-25 05:51:48.090260569 +0000 UTC m=+799.126138549" watchObservedRunningTime="2026-01-25 05:51:48.119215841 +0000 UTC m=+799.155093822"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.134282 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" podStartSLOduration=2.71118348 podStartE2EDuration="14.134268903s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.546731967 +0000 UTC m=+786.582609948" lastFinishedPulling="2026-01-25 05:51:46.969817392 +0000 UTC m=+798.005695371" observedRunningTime="2026-01-25 05:51:48.133512416 +0000 UTC m=+799.169390396" watchObservedRunningTime="2026-01-25 05:51:48.134268903 +0000 UTC m=+799.170146883"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.137598 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" podStartSLOduration=2.374808171 podStartE2EDuration="14.137592812s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.170939015 +0000 UTC m=+786.206816984" lastFinishedPulling="2026-01-25 05:51:46.933723645 +0000 UTC m=+797.969601625" observedRunningTime="2026-01-25 05:51:48.120091953 +0000 UTC m=+799.155969933" watchObservedRunningTime="2026-01-25 05:51:48.137592812 +0000 UTC m=+799.173470792"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.146395 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" podStartSLOduration=2.698702227 podStartE2EDuration="14.146385556s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.525072185 +0000 UTC m=+786.560950164" lastFinishedPulling="2026-01-25 05:51:46.972755513 +0000 UTC m=+798.008633493" observedRunningTime="2026-01-25 05:51:48.144608294 +0000 UTC m=+799.180486274" watchObservedRunningTime="2026-01-25 05:51:48.146385556 +0000 UTC m=+799.182263535"
Jan 25 05:51:48 crc kubenswrapper[4728]: I0125 05:51:48.156476 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dsqrk" podStartSLOduration=3.02091852 podStartE2EDuration="14.156468303s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.868116531 +0000 UTC m=+786.903994511" lastFinishedPulling="2026-01-25 05:51:47.003666324 +0000 UTC m=+798.039544294" observedRunningTime="2026-01-25 05:51:48.153997012 +0000 UTC m=+799.189874992" watchObservedRunningTime="2026-01-25 05:51:48.156468303 +0000 UTC m=+799.192346283"
Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.220722 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"
Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.235641 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad255789-2727-45f9-a389-fee59b5a141a-cert\") pod \"infra-operator-controller-manager-694cf4f878-5dxmw\" (UID: \"ad255789-2727-45f9-a389-fee59b5a141a\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"
Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.252789 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"
Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.322082 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"
Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.328729 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b-cert\") pod \"openstack-baremetal-operator-controller-manager-848957f4b4kndnb\" (UID: \"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"
Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.369058 4728 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.639727 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw"] Jan 25 05:51:50 crc kubenswrapper[4728]: W0125 05:51:50.641407 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad255789_2727_45f9_a389_fee59b5a141a.slice/crio-c881175f60eb6e52651102486b309e663061a0f58d6d0efb08848e4a8faeeb59 WatchSource:0}: Error finding container c881175f60eb6e52651102486b309e663061a0f58d6d0efb08848e4a8faeeb59: Status 404 returned error can't find the container with id c881175f60eb6e52651102486b309e663061a0f58d6d0efb08848e4a8faeeb59 Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.729168 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.729251 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:51:50 crc kubenswrapper[4728]: E0125 05:51:50.729410 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 25 05:51:50 crc kubenswrapper[4728]: E0125 05:51:50.729458 4728 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:52:06.729441176 +0000 UTC m=+817.765319155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "metrics-server-cert" not found Jan 25 05:51:50 crc kubenswrapper[4728]: E0125 05:51:50.729787 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 25 05:51:50 crc kubenswrapper[4728]: E0125 05:51:50.729905 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs podName:585498aa-6031-43a2-ab1a-f52d1bef52e7 nodeName:}" failed. No retries permitted until 2026-01-25 05:52:06.729887597 +0000 UTC m=+817.765765578 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs") pod "openstack-operator-controller-manager-65d46cfd44-wsx7f" (UID: "585498aa-6031-43a2-ab1a-f52d1bef52e7") : secret "webhook-server-cert" not found Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.765677 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb"] Jan 25 05:51:50 crc kubenswrapper[4728]: W0125 05:51:50.767220 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae373b2_dc3c_4c6d_b2bb_69a15bc1d52b.slice/crio-f412e05a2fc3293a36ad5363e4c01836278484f423c37974d7ec7eac80fc1e35 WatchSource:0}: Error finding container f412e05a2fc3293a36ad5363e4c01836278484f423c37974d7ec7eac80fc1e35: Status 404 returned error can't find the container with id f412e05a2fc3293a36ad5363e4c01836278484f423c37974d7ec7eac80fc1e35 Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.995877 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" event={"ID":"ad255789-2727-45f9-a389-fee59b5a141a","Type":"ContainerStarted","Data":"c881175f60eb6e52651102486b309e663061a0f58d6d0efb08848e4a8faeeb59"} Jan 25 05:51:50 crc kubenswrapper[4728]: I0125 05:51:50.997081 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" event={"ID":"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b","Type":"ContainerStarted","Data":"f412e05a2fc3293a36ad5363e4c01836278484f423c37974d7ec7eac80fc1e35"} Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.020412 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" 
event={"ID":"e13158ce-126d-4980-9fbd-e7ed492ee879","Type":"ContainerStarted","Data":"bd61229f43c65633f364edcecb34f1f2e00bac62b149ead8c078472591a2775d"} Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.021682 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.037839 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" podStartSLOduration=2.197329209 podStartE2EDuration="20.037812052s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.736070599 +0000 UTC m=+786.771948579" lastFinishedPulling="2026-01-25 05:51:53.576553442 +0000 UTC m=+804.612431422" observedRunningTime="2026-01-25 05:51:54.034618378 +0000 UTC m=+805.070496358" watchObservedRunningTime="2026-01-25 05:51:54.037812052 +0000 UTC m=+805.073690031" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.498841 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-qgwgl" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.504377 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-9gflx" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.518366 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-f2dft" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.560238 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-pk7js" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.561543 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-nkvhs" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.667429 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-p2f2m" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.685679 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-tzts2" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.693890 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-ck9lm" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.741876 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-b65wc" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.761278 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-982pl" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.784644 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-h4hvm" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.792488 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-ztxz6" Jan 25 05:51:54 crc kubenswrapper[4728]: I0125 05:51:54.907562 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-mknzf" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.060652 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" event={"ID":"5f8b10f8-34e5-4250-ade1-7d47b008a4d6","Type":"ContainerStarted","Data":"b3ab77ea29d2d523874f19fff09e35d3cc0dbe750cfe99a4182ece5f85ef3d67"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.061074 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.061932 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" event={"ID":"4e555f00-c133-4b06-b5df-005238b0541d","Type":"ContainerStarted","Data":"2d6cf06164dcf7583176a4f7d0bde165340bf77e897d565a4f260a522078935b"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.062139 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.063068 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" event={"ID":"76e4e202-a355-4666-8e84-96486d73174c","Type":"ContainerStarted","Data":"d24812132052fa80bad7a00826d12c91070e6b92949c09cc1efb351a803c0aae"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.063260 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.064697 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" event={"ID":"94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4","Type":"ContainerStarted","Data":"56f6fe79a72c22135430b5236fc0c964b7e5fea7d826172cf97317ae5e75b55b"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.065563 4728 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.066084 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" event={"ID":"3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b","Type":"ContainerStarted","Data":"fa0c5183b2e8e48cfbbba9c53008f5a1a47970d843039523a72816fede96e509"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.066802 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.068243 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" event={"ID":"ad255789-2727-45f9-a389-fee59b5a141a","Type":"ContainerStarted","Data":"29d8f86a7d0c09b059f4d172a4239644903c052c03b1e336b7367963fda31ef2"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.068637 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.069893 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" event={"ID":"733ecefc-22d8-4a52-9540-09b4aac018e1","Type":"ContainerStarted","Data":"211e80c567c66efb933251ac46ce37ab102913a55d8c34dbfdc1cc7ac486a397"} Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.070276 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.081524 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" podStartSLOduration=2.857865655 podStartE2EDuration="25.081512391s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.741914862 +0000 UTC m=+786.777792842" lastFinishedPulling="2026-01-25 05:51:57.965561598 +0000 UTC m=+809.001439578" observedRunningTime="2026-01-25 05:51:59.078144479 +0000 UTC m=+810.114022459" watchObservedRunningTime="2026-01-25 05:51:59.081512391 +0000 UTC m=+810.117390372" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.110107 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" podStartSLOduration=17.91450802 podStartE2EDuration="25.110098167s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:50.770026594 +0000 UTC m=+801.805904574" lastFinishedPulling="2026-01-25 05:51:57.965616741 +0000 UTC m=+809.001494721" observedRunningTime="2026-01-25 05:51:59.107177148 +0000 UTC m=+810.143055128" watchObservedRunningTime="2026-01-25 05:51:59.110098167 +0000 UTC m=+810.145976137" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.126148 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" podStartSLOduration=2.899750193 podStartE2EDuration="25.126133561s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.757791156 +0000 UTC m=+786.793669135" lastFinishedPulling="2026-01-25 05:51:57.984174523 +0000 UTC m=+809.020052503" observedRunningTime="2026-01-25 05:51:59.121882113 +0000 UTC m=+810.157760092" watchObservedRunningTime="2026-01-25 05:51:59.126133561 +0000 UTC m=+810.162011541" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.135780 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" podStartSLOduration=17.795114741 podStartE2EDuration="25.135770598s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:50.643526692 +0000 UTC m=+801.679404671" lastFinishedPulling="2026-01-25 05:51:57.984182558 +0000 UTC m=+809.020060528" observedRunningTime="2026-01-25 05:51:59.133619341 +0000 UTC m=+810.169497321" watchObservedRunningTime="2026-01-25 05:51:59.135770598 +0000 UTC m=+810.171648579" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.145561 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" podStartSLOduration=2.877811887 podStartE2EDuration="25.145551205s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.76254407 +0000 UTC m=+786.798422051" lastFinishedPulling="2026-01-25 05:51:58.030283389 +0000 UTC m=+809.066161369" observedRunningTime="2026-01-25 05:51:59.144536181 +0000 UTC m=+810.180414161" watchObservedRunningTime="2026-01-25 05:51:59.145551205 +0000 UTC m=+810.181429185" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.158723 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" podStartSLOduration=2.960903845 podStartE2EDuration="25.158714933s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.76780391 +0000 UTC m=+786.803681890" lastFinishedPulling="2026-01-25 05:51:57.965614998 +0000 UTC m=+809.001492978" observedRunningTime="2026-01-25 05:51:59.154855654 +0000 UTC m=+810.190733634" watchObservedRunningTime="2026-01-25 05:51:59.158714933 +0000 UTC m=+810.194592913" Jan 25 05:51:59 crc kubenswrapper[4728]: I0125 05:51:59.181038 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" podStartSLOduration=2.9558886060000003 podStartE2EDuration="25.181003992s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="2026-01-25 05:51:35.760365471 +0000 UTC m=+786.796243451" lastFinishedPulling="2026-01-25 05:51:57.985480857 +0000 UTC m=+809.021358837" observedRunningTime="2026-01-25 05:51:59.174530242 +0000 UTC m=+810.210408222" watchObservedRunningTime="2026-01-25 05:51:59.181003992 +0000 UTC m=+810.216881972" Jan 25 05:52:04 crc kubenswrapper[4728]: I0125 05:52:04.748538 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-8kptt" Jan 25 05:52:04 crc kubenswrapper[4728]: I0125 05:52:04.926595 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8hkv9" Jan 25 05:52:04 crc kubenswrapper[4728]: I0125 05:52:04.936815 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-r6mh4" Jan 25 05:52:05 crc kubenswrapper[4728]: I0125 05:52:05.006949 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-jhgs5" Jan 25 05:52:05 crc kubenswrapper[4728]: I0125 05:52:05.011930 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-62h6z" Jan 25 05:52:05 crc kubenswrapper[4728]: I0125 05:52:05.147813 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-5lhdq" Jan 25 05:52:06 crc kubenswrapper[4728]: I0125 05:52:06.749355 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:06 crc kubenswrapper[4728]: I0125 05:52:06.749479 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:06 crc kubenswrapper[4728]: I0125 05:52:06.755653 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-webhook-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:06 crc kubenswrapper[4728]: I0125 05:52:06.755755 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/585498aa-6031-43a2-ab1a-f52d1bef52e7-metrics-certs\") pod \"openstack-operator-controller-manager-65d46cfd44-wsx7f\" (UID: \"585498aa-6031-43a2-ab1a-f52d1bef52e7\") " pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:07 crc kubenswrapper[4728]: I0125 05:52:07.025411 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:07 crc kubenswrapper[4728]: I0125 05:52:07.411981 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f"] Jan 25 05:52:07 crc kubenswrapper[4728]: W0125 05:52:07.419593 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585498aa_6031_43a2_ab1a_f52d1bef52e7.slice/crio-dd4a7c3a8bced04847b6711c1f0e3170499a11597ed4c14e293acb7c7b767d7f WatchSource:0}: Error finding container dd4a7c3a8bced04847b6711c1f0e3170499a11597ed4c14e293acb7c7b767d7f: Status 404 returned error can't find the container with id dd4a7c3a8bced04847b6711c1f0e3170499a11597ed4c14e293acb7c7b767d7f Jan 25 05:52:08 crc kubenswrapper[4728]: I0125 05:52:08.134116 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" event={"ID":"585498aa-6031-43a2-ab1a-f52d1bef52e7","Type":"ContainerStarted","Data":"9f0e06ff86f1aae9d28477c435884c771b7c138f04f01f5bf4756cbe06608824"} Jan 25 05:52:08 crc kubenswrapper[4728]: I0125 05:52:08.134491 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" event={"ID":"585498aa-6031-43a2-ab1a-f52d1bef52e7","Type":"ContainerStarted","Data":"dd4a7c3a8bced04847b6711c1f0e3170499a11597ed4c14e293acb7c7b767d7f"} Jan 25 05:52:08 crc kubenswrapper[4728]: I0125 05:52:08.134511 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:08 crc kubenswrapper[4728]: I0125 05:52:08.156010 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" 
podStartSLOduration=34.15599655 podStartE2EDuration="34.15599655s" podCreationTimestamp="2026-01-25 05:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:52:08.153781543 +0000 UTC m=+819.189659523" watchObservedRunningTime="2026-01-25 05:52:08.15599655 +0000 UTC m=+819.191874530" Jan 25 05:52:10 crc kubenswrapper[4728]: I0125 05:52:10.258744 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-5dxmw" Jan 25 05:52:10 crc kubenswrapper[4728]: I0125 05:52:10.373566 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-848957f4b4kndnb" Jan 25 05:52:12 crc kubenswrapper[4728]: I0125 05:52:12.899718 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:52:12 crc kubenswrapper[4728]: I0125 05:52:12.900094 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:52:17 crc kubenswrapper[4728]: I0125 05:52:17.030636 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65d46cfd44-wsx7f" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.141134 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d89c95f8f-g82r5"] Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 
05:52:35.143535 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.146079 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.147546 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-94fzn" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.147714 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.148490 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.150571 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d89c95f8f-g82r5"] Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.181110 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cdfb76457-cr7br"] Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.182230 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.185179 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.195857 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cdfb76457-cr7br"] Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.214064 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrf5g\" (UniqueName: \"kubernetes.io/projected/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-kube-api-access-nrf5g\") pod \"dnsmasq-dns-6d89c95f8f-g82r5\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.214234 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-config\") pod \"dnsmasq-dns-6d89c95f8f-g82r5\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.315676 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-dns-svc\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.315975 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-config\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 
crc kubenswrapper[4728]: I0125 05:52:35.316084 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrf5g\" (UniqueName: \"kubernetes.io/projected/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-kube-api-access-nrf5g\") pod \"dnsmasq-dns-6d89c95f8f-g82r5\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.316165 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-config\") pod \"dnsmasq-dns-6d89c95f8f-g82r5\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.316297 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5l6q\" (UniqueName: \"kubernetes.io/projected/6f9eb7e1-0883-4fcd-a869-064a777694bf-kube-api-access-q5l6q\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.317529 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-config\") pod \"dnsmasq-dns-6d89c95f8f-g82r5\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.334972 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrf5g\" (UniqueName: \"kubernetes.io/projected/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-kube-api-access-nrf5g\") pod \"dnsmasq-dns-6d89c95f8f-g82r5\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 
05:52:35.416953 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-dns-svc\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.417010 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-config\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.417143 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5l6q\" (UniqueName: \"kubernetes.io/projected/6f9eb7e1-0883-4fcd-a869-064a777694bf-kube-api-access-q5l6q\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.417814 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-dns-svc\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.417909 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-config\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.432505 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5l6q\" 
(UniqueName: \"kubernetes.io/projected/6f9eb7e1-0883-4fcd-a869-064a777694bf-kube-api-access-q5l6q\") pod \"dnsmasq-dns-cdfb76457-cr7br\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.461094 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.507047 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.853595 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d89c95f8f-g82r5"] Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.856089 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 05:52:35 crc kubenswrapper[4728]: I0125 05:52:35.901820 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cdfb76457-cr7br"] Jan 25 05:52:36 crc kubenswrapper[4728]: I0125 05:52:36.335071 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" event={"ID":"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252","Type":"ContainerStarted","Data":"e9d841c094cf45294730fdb0fb66112766ec6cb0cb532f5150306e91be1735f9"} Jan 25 05:52:36 crc kubenswrapper[4728]: I0125 05:52:36.337612 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" event={"ID":"6f9eb7e1-0883-4fcd-a869-064a777694bf","Type":"ContainerStarted","Data":"889ad8ba30b061330a36d8975297ce4a3b969a7f5cde634c4de5fa0cd62acdf8"} Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.127340 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f9dmg"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.129273 4728 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.140095 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9dmg"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.156823 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqffq\" (UniqueName: \"kubernetes.io/projected/35e52af1-2297-446a-a0ca-bb4d718783c6-kube-api-access-lqffq\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.156882 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-catalog-content\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.156913 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-utilities\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.169805 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d89c95f8f-g82r5"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.200478 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5847c87bfc-nlp42"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.201589 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.210115 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5847c87bfc-nlp42"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.258088 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqffq\" (UniqueName: \"kubernetes.io/projected/35e52af1-2297-446a-a0ca-bb4d718783c6-kube-api-access-lqffq\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.258703 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-catalog-content\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.258740 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-utilities\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.259203 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-utilities\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.259516 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-catalog-content\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.291052 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqffq\" (UniqueName: \"kubernetes.io/projected/35e52af1-2297-446a-a0ca-bb4d718783c6-kube-api-access-lqffq\") pod \"certified-operators-f9dmg\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.360556 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-config\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.360605 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-dns-svc\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.360651 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lfrp\" (UniqueName: \"kubernetes.io/projected/67a737a1-81e1-4999-9d7d-24f2cbcb9682-kube-api-access-6lfrp\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.444541 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cdfb76457-cr7br"] Jan 25 
05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.463116 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-config\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.463154 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-dns-svc\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.463200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lfrp\" (UniqueName: \"kubernetes.io/projected/67a737a1-81e1-4999-9d7d-24f2cbcb9682-kube-api-access-6lfrp\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.464083 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-config\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.464146 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-dns-svc\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.464596 4728 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5f67ddf84f-r6lmd"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.468389 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.476582 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f67ddf84f-r6lmd"] Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.485113 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.505820 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lfrp\" (UniqueName: \"kubernetes.io/projected/67a737a1-81e1-4999-9d7d-24f2cbcb9682-kube-api-access-6lfrp\") pod \"dnsmasq-dns-5847c87bfc-nlp42\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.530070 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.568630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-dns-svc\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.568746 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-config\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.568853 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7m2\" (UniqueName: \"kubernetes.io/projected/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-kube-api-access-6q7m2\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.671781 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7m2\" (UniqueName: \"kubernetes.io/projected/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-kube-api-access-6q7m2\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.672143 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-dns-svc\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: 
\"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.672202 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-config\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.673140 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-config\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.674040 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-dns-svc\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.688962 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7m2\" (UniqueName: \"kubernetes.io/projected/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-kube-api-access-6q7m2\") pod \"dnsmasq-dns-5f67ddf84f-r6lmd\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:38 crc kubenswrapper[4728]: I0125 05:52:38.813856 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.081835 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5847c87bfc-nlp42"] Jan 25 05:52:39 crc kubenswrapper[4728]: W0125 05:52:39.088445 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a737a1_81e1_4999_9d7d_24f2cbcb9682.slice/crio-94620b5e54eafe6d781e8c4bf191a45f56d3dc2b6f8d6ae3bc9d00c21e0e701d WatchSource:0}: Error finding container 94620b5e54eafe6d781e8c4bf191a45f56d3dc2b6f8d6ae3bc9d00c21e0e701d: Status 404 returned error can't find the container with id 94620b5e54eafe6d781e8c4bf191a45f56d3dc2b6f8d6ae3bc9d00c21e0e701d Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.162364 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9dmg"] Jan 25 05:52:39 crc kubenswrapper[4728]: W0125 05:52:39.163665 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35e52af1_2297_446a_a0ca_bb4d718783c6.slice/crio-db876a6c67c5822cf103011337fad38e37129ddf73c77738b768eeba6a3b34a5 WatchSource:0}: Error finding container db876a6c67c5822cf103011337fad38e37129ddf73c77738b768eeba6a3b34a5: Status 404 returned error can't find the container with id db876a6c67c5822cf103011337fad38e37129ddf73c77738b768eeba6a3b34a5 Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.251052 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f67ddf84f-r6lmd"] Jan 25 05:52:39 crc kubenswrapper[4728]: W0125 05:52:39.262203 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c25a9b6_42e8_4e9f_8c23_597279ad0d03.slice/crio-8819bceca0ea0cbf768c10ab102158f8f9772f8133f916dacb2c9b47fd8ba5f8 WatchSource:0}: Error finding 
container 8819bceca0ea0cbf768c10ab102158f8f9772f8133f916dacb2c9b47fd8ba5f8: Status 404 returned error can't find the container with id 8819bceca0ea0cbf768c10ab102158f8f9772f8133f916dacb2c9b47fd8ba5f8 Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.357967 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.360739 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365009 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365162 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365342 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365409 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365440 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365594 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7x6cq" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.365717 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.367487 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.374908 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerID="509a1318aeb8bf578949e338c287c60cdba78db2bf59883101eeee10890042db" exitCode=0 Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.375010 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9dmg" event={"ID":"35e52af1-2297-446a-a0ca-bb4d718783c6","Type":"ContainerDied","Data":"509a1318aeb8bf578949e338c287c60cdba78db2bf59883101eeee10890042db"} Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.375030 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9dmg" event={"ID":"35e52af1-2297-446a-a0ca-bb4d718783c6","Type":"ContainerStarted","Data":"db876a6c67c5822cf103011337fad38e37129ddf73c77738b768eeba6a3b34a5"} Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.376517 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" event={"ID":"67a737a1-81e1-4999-9d7d-24f2cbcb9682","Type":"ContainerStarted","Data":"94620b5e54eafe6d781e8c4bf191a45f56d3dc2b6f8d6ae3bc9d00c21e0e701d"} Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.378774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" event={"ID":"7c25a9b6-42e8-4e9f-8c23-597279ad0d03","Type":"ContainerStarted","Data":"8819bceca0ea0cbf768c10ab102158f8f9772f8133f916dacb2c9b47fd8ba5f8"} Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485196 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485269 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485328 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485360 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485470 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ddd3d99e-20c0-4133-9537-413f83a04edb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485599 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-config-data\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485620 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485687 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485717 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ddd3d99e-20c0-4133-9537-413f83a04edb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485767 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.485787 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qg8\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-kube-api-access-52qg8\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592362 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " 
pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592408 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592444 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592472 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592501 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ddd3d99e-20c0-4133-9537-413f83a04edb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592535 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-config-data\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592551 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592577 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592595 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ddd3d99e-20c0-4133-9537-413f83a04edb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592614 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.592631 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52qg8\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-kube-api-access-52qg8\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.593110 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.594066 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-config-data\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.594629 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.594688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.596038 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.599797 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.599953 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.600239 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.600915 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.601213 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.601235 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.601345 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.601368 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.601498 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-k9w2b" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.601780 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.605127 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.606944 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ddd3d99e-20c0-4133-9537-413f83a04edb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc 
kubenswrapper[4728]: I0125 05:52:39.607256 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.610717 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qg8\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-kube-api-access-52qg8\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.615016 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.621124 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ddd3d99e-20c0-4133-9537-413f83a04edb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.628553 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693676 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693741 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693802 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693827 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693862 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693880 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693939 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.693982 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.694420 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.694519 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfxd8\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-kube-api-access-rfxd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.694616 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.703878 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797643 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797722 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797766 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797827 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797852 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797884 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797900 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797953 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797968 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.797991 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.798008 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfxd8\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-kube-api-access-rfxd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.800581 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.801385 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.802060 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.802768 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.803310 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.803419 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.806988 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.807407 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.813695 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc 
kubenswrapper[4728]: I0125 05:52:39.823444 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfxd8\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-kube-api-access-rfxd8\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.836107 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.850480 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:39 crc kubenswrapper[4728]: I0125 05:52:39.982569 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.147745 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.181704 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.394535 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2","Type":"ContainerStarted","Data":"71097818bb7edee0d694a7e3e899c036a69ba9e86dc9332774d6cc69365c5367"} Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.397539 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ddd3d99e-20c0-4133-9537-413f83a04edb","Type":"ContainerStarted","Data":"1c37604a4a59ee8a403c1782230219c6fca112ca4f3ba003e81afc4fd6307dc6"} Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.401386 4728 generic.go:334] "Generic (PLEG): container finished" podID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerID="58439ab70faec36a73f783a75a3e954e7ee34173b8e6bd8d1232c68bdb72be76" exitCode=0 Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.401433 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9dmg" event={"ID":"35e52af1-2297-446a-a0ca-bb4d718783c6","Type":"ContainerDied","Data":"58439ab70faec36a73f783a75a3e954e7ee34173b8e6bd8d1232c68bdb72be76"} Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.738870 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.743243 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.745874 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tpjcv" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.746964 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.746982 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.747016 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.752738 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.761442 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.817758 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/04585afa-3da7-4da9-896a-2acc02ff910e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.817830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.817851 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04585afa-3da7-4da9-896a-2acc02ff910e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.817901 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-kolla-config\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.817933 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/04585afa-3da7-4da9-896a-2acc02ff910e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.817958 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.818017 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzcnd\" (UniqueName: \"kubernetes.io/projected/04585afa-3da7-4da9-896a-2acc02ff910e-kube-api-access-lzcnd\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.818053 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-config-data-default\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919051 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919086 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04585afa-3da7-4da9-896a-2acc02ff910e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919135 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-kolla-config\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919178 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/04585afa-3da7-4da9-896a-2acc02ff910e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: 
\"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919596 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919648 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/04585afa-3da7-4da9-896a-2acc02ff910e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919845 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-kolla-config\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.919971 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzcnd\" (UniqueName: \"kubernetes.io/projected/04585afa-3da7-4da9-896a-2acc02ff910e-kube-api-access-lzcnd\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.920162 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-config-data-default\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: 
I0125 05:52:40.920244 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/04585afa-3da7-4da9-896a-2acc02ff910e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.921101 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-config-data-default\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.923607 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04585afa-3da7-4da9-896a-2acc02ff910e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.926785 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/04585afa-3da7-4da9-896a-2acc02ff910e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.933506 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04585afa-3da7-4da9-896a-2acc02ff910e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.934494 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzcnd\" (UniqueName: 
\"kubernetes.io/projected/04585afa-3da7-4da9-896a-2acc02ff910e-kube-api-access-lzcnd\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:40 crc kubenswrapper[4728]: I0125 05:52:40.940025 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"04585afa-3da7-4da9-896a-2acc02ff910e\") " pod="openstack/openstack-galera-0" Jan 25 05:52:41 crc kubenswrapper[4728]: I0125 05:52:41.060461 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 25 05:52:41 crc kubenswrapper[4728]: I0125 05:52:41.419984 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9dmg" event={"ID":"35e52af1-2297-446a-a0ca-bb4d718783c6","Type":"ContainerStarted","Data":"b25048e2a718805be8f75f573ee3b91ce477eada638aba52c29ec5046512ef46"} Jan 25 05:52:41 crc kubenswrapper[4728]: I0125 05:52:41.437600 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f9dmg" podStartSLOduration=1.936275314 podStartE2EDuration="3.437581578s" podCreationTimestamp="2026-01-25 05:52:38 +0000 UTC" firstStartedPulling="2026-01-25 05:52:39.377130088 +0000 UTC m=+850.413008068" lastFinishedPulling="2026-01-25 05:52:40.878436352 +0000 UTC m=+851.914314332" observedRunningTime="2026-01-25 05:52:41.433629396 +0000 UTC m=+852.469507376" watchObservedRunningTime="2026-01-25 05:52:41.437581578 +0000 UTC m=+852.473459558" Jan 25 05:52:41 crc kubenswrapper[4728]: I0125 05:52:41.639874 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.221371 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 25 05:52:42 crc 
kubenswrapper[4728]: I0125 05:52:42.223345 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.225419 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.226385 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.226796 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.226827 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lwd2s" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.227249 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.365933 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366043 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366069 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/847abeb1-1f82-44cc-a876-2b8787688696-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366222 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366259 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/847abeb1-1f82-44cc-a876-2b8787688696-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366393 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366471 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trl5j\" (UniqueName: \"kubernetes.io/projected/847abeb1-1f82-44cc-a876-2b8787688696-kube-api-access-trl5j\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.366658 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847abeb1-1f82-44cc-a876-2b8787688696-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.433115 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04585afa-3da7-4da9-896a-2acc02ff910e","Type":"ContainerStarted","Data":"e1d4fb5de1585fe77c585ade5268bbf791dbccf40b636bab943e3ea28caeec7e"} Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.468463 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.468595 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trl5j\" (UniqueName: \"kubernetes.io/projected/847abeb1-1f82-44cc-a876-2b8787688696-kube-api-access-trl5j\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.468665 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847abeb1-1f82-44cc-a876-2b8787688696-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.468916 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.468941 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.468992 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/847abeb1-1f82-44cc-a876-2b8787688696-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.469033 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.469069 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/847abeb1-1f82-44cc-a876-2b8787688696-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.469839 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.470449 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/847abeb1-1f82-44cc-a876-2b8787688696-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.472652 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.474846 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.479410 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/847abeb1-1f82-44cc-a876-2b8787688696-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.500914 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/847abeb1-1f82-44cc-a876-2b8787688696-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.501893 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847abeb1-1f82-44cc-a876-2b8787688696-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.522355 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trl5j\" (UniqueName: \"kubernetes.io/projected/847abeb1-1f82-44cc-a876-2b8787688696-kube-api-access-trl5j\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.546544 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"847abeb1-1f82-44cc-a876-2b8787688696\") " pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.559876 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.576685 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.576797 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.582059 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-s2h9w" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.582441 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.582565 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.672509 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162db63-7667-482e-a9bd-174365a318cc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.672586 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5162db63-7667-482e-a9bd-174365a318cc-kolla-config\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.672609 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5162db63-7667-482e-a9bd-174365a318cc-config-data\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.672630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhfn\" (UniqueName: \"kubernetes.io/projected/5162db63-7667-482e-a9bd-174365a318cc-kube-api-access-rbhfn\") pod \"memcached-0\" (UID: 
\"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.672816 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162db63-7667-482e-a9bd-174365a318cc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.775032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162db63-7667-482e-a9bd-174365a318cc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.775107 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5162db63-7667-482e-a9bd-174365a318cc-kolla-config\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.775127 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5162db63-7667-482e-a9bd-174365a318cc-config-data\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.775147 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhfn\" (UniqueName: \"kubernetes.io/projected/5162db63-7667-482e-a9bd-174365a318cc-kube-api-access-rbhfn\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.775205 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162db63-7667-482e-a9bd-174365a318cc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.775847 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5162db63-7667-482e-a9bd-174365a318cc-kolla-config\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.779456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5162db63-7667-482e-a9bd-174365a318cc-config-data\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.784853 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162db63-7667-482e-a9bd-174365a318cc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.785280 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162db63-7667-482e-a9bd-174365a318cc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.791491 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhfn\" (UniqueName: \"kubernetes.io/projected/5162db63-7667-482e-a9bd-174365a318cc-kube-api-access-rbhfn\") pod \"memcached-0\" (UID: \"5162db63-7667-482e-a9bd-174365a318cc\") " 
pod="openstack/memcached-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.851178 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.900304 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.900370 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.900407 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.900808 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b2523b29483494490949dbc53c4bdb9d3c9b4b7a93fe4055f11cc91a7d873b4"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.900856 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://5b2523b29483494490949dbc53c4bdb9d3c9b4b7a93fe4055f11cc91a7d873b4" gracePeriod=600 Jan 25 
05:52:42 crc kubenswrapper[4728]: I0125 05:52:42.913577 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.343614 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 25 05:52:43 crc kubenswrapper[4728]: W0125 05:52:43.360180 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5162db63_7667_482e_a9bd_174365a318cc.slice/crio-060f9d48f10329f50e1705245452cf06f28baacf441db8ebe859e47f9a2597bc WatchSource:0}: Error finding container 060f9d48f10329f50e1705245452cf06f28baacf441db8ebe859e47f9a2597bc: Status 404 returned error can't find the container with id 060f9d48f10329f50e1705245452cf06f28baacf441db8ebe859e47f9a2597bc Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.365797 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.443991 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"847abeb1-1f82-44cc-a876-2b8787688696","Type":"ContainerStarted","Data":"b86a09da5789cd9c1a292d556bdfc2c26a6dc368989ee19c67ccadea843a4820"} Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.448982 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5162db63-7667-482e-a9bd-174365a318cc","Type":"ContainerStarted","Data":"060f9d48f10329f50e1705245452cf06f28baacf441db8ebe859e47f9a2597bc"} Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.451865 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="5b2523b29483494490949dbc53c4bdb9d3c9b4b7a93fe4055f11cc91a7d873b4" exitCode=0 Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.451911 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"5b2523b29483494490949dbc53c4bdb9d3c9b4b7a93fe4055f11cc91a7d873b4"} Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.451946 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"cfcdf54d823ad6beb0133f29e917610e444d0fa6cfe06f430b6751fe7dbea675"} Jan 25 05:52:43 crc kubenswrapper[4728]: I0125 05:52:43.451965 4728 scope.go:117] "RemoveContainer" containerID="4869a4031b431cc23a01935621e0bf0cd63107d4d6edf2fef74234f9435dad57" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.062155 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.063576 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.065885 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8t4n8" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.068494 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.208063 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf5kb\" (UniqueName: \"kubernetes.io/projected/2a691f37-33ba-4d5b-988a-f8417e8e630b-kube-api-access-wf5kb\") pod \"kube-state-metrics-0\" (UID: \"2a691f37-33ba-4d5b-988a-f8417e8e630b\") " pod="openstack/kube-state-metrics-0" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.310341 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf5kb\" (UniqueName: 
\"kubernetes.io/projected/2a691f37-33ba-4d5b-988a-f8417e8e630b-kube-api-access-wf5kb\") pod \"kube-state-metrics-0\" (UID: \"2a691f37-33ba-4d5b-988a-f8417e8e630b\") " pod="openstack/kube-state-metrics-0" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.332076 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf5kb\" (UniqueName: \"kubernetes.io/projected/2a691f37-33ba-4d5b-988a-f8417e8e630b-kube-api-access-wf5kb\") pod \"kube-state-metrics-0\" (UID: \"2a691f37-33ba-4d5b-988a-f8417e8e630b\") " pod="openstack/kube-state-metrics-0" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.387514 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 25 05:52:44 crc kubenswrapper[4728]: I0125 05:52:44.859151 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:52:45 crc kubenswrapper[4728]: I0125 05:52:45.474966 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2a691f37-33ba-4d5b-988a-f8417e8e630b","Type":"ContainerStarted","Data":"e135b18907eb20ada58cea6f074df86d4b19c797c99361bdfffde6ebb2e5693f"} Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.061439 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6x4kp"] Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.063128 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.065782 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kz5k7" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.066155 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.066483 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.088775 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6x4kp"] Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.106632 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-mr9hh"] Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.108412 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.117244 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mr9hh"] Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172072 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-lib\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172112 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-log-ovn\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172152 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b75ba-c337-4422-88ce-aace97ac7638-combined-ca-bundle\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172170 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-run-ovn\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172246 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-run\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172262 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d8c\" (UniqueName: \"kubernetes.io/projected/8a1328da-8d1f-4f1e-9f8b-d61559200740-kube-api-access-n6d8c\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172398 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/193b75ba-c337-4422-88ce-aace97ac7638-ovn-controller-tls-certs\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-run\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172472 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a1328da-8d1f-4f1e-9f8b-d61559200740-scripts\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172495 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qlf\" (UniqueName: 
\"kubernetes.io/projected/193b75ba-c337-4422-88ce-aace97ac7638-kube-api-access-h8qlf\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172571 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-log\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172603 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-etc-ovs\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.172691 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/193b75ba-c337-4422-88ce-aace97ac7638-scripts\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.275764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-etc-ovs\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.275863 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/193b75ba-c337-4422-88ce-aace97ac7638-scripts\") pod \"ovn-controller-6x4kp\" (UID: 
\"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.275940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-lib\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.275976 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-log-ovn\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276056 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b75ba-c337-4422-88ce-aace97ac7638-combined-ca-bundle\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276079 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-run-ovn\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276137 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-run\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276156 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d8c\" (UniqueName: \"kubernetes.io/projected/8a1328da-8d1f-4f1e-9f8b-d61559200740-kube-api-access-n6d8c\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276260 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/193b75ba-c337-4422-88ce-aace97ac7638-ovn-controller-tls-certs\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276284 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qlf\" (UniqueName: \"kubernetes.io/projected/193b75ba-c337-4422-88ce-aace97ac7638-kube-api-access-h8qlf\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276302 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-run\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.276343 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a1328da-8d1f-4f1e-9f8b-d61559200740-scripts\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.277019 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-run-ovn\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.277624 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-log\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.277953 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-run\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.278795 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-etc-ovs\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.278801 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-log\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.278866 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-run\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " 
pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.278946 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/193b75ba-c337-4422-88ce-aace97ac7638-var-log-ovn\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.278986 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8a1328da-8d1f-4f1e-9f8b-d61559200740-var-lib\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.280621 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a1328da-8d1f-4f1e-9f8b-d61559200740-scripts\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.281891 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/193b75ba-c337-4422-88ce-aace97ac7638-scripts\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.285065 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/193b75ba-c337-4422-88ce-aace97ac7638-ovn-controller-tls-certs\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.289501 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b75ba-c337-4422-88ce-aace97ac7638-combined-ca-bundle\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.293799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qlf\" (UniqueName: \"kubernetes.io/projected/193b75ba-c337-4422-88ce-aace97ac7638-kube-api-access-h8qlf\") pod \"ovn-controller-6x4kp\" (UID: \"193b75ba-c337-4422-88ce-aace97ac7638\") " pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.306115 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d8c\" (UniqueName: \"kubernetes.io/projected/8a1328da-8d1f-4f1e-9f8b-d61559200740-kube-api-access-n6d8c\") pod \"ovn-controller-ovs-mr9hh\" (UID: \"8a1328da-8d1f-4f1e-9f8b-d61559200740\") " pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.404661 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6x4kp" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.421611 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.486507 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.489848 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:48 crc kubenswrapper[4728]: I0125 05:52:48.527948 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.371183 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.373665 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.374016 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.377233 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.377819 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wvjpm" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.377856 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.377932 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.378029 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.408671 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.408807 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f6d754-4e79-4f05-9986-4abde93d34f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.408841 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.408887 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf8pc\" (UniqueName: \"kubernetes.io/projected/52f6d754-4e79-4f05-9986-4abde93d34f0-kube-api-access-jf8pc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.408970 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f6d754-4e79-4f05-9986-4abde93d34f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.409042 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.409072 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.409150 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f6d754-4e79-4f05-9986-4abde93d34f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " 
pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.441195 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6x4kp"] Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.511910 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512284 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512339 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f6d754-4e79-4f05-9986-4abde93d34f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512367 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512411 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf8pc\" (UniqueName: \"kubernetes.io/projected/52f6d754-4e79-4f05-9986-4abde93d34f0-kube-api-access-jf8pc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " 
pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512487 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f6d754-4e79-4f05-9986-4abde93d34f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512522 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512543 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.512591 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f6d754-4e79-4f05-9986-4abde93d34f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.513277 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f6d754-4e79-4f05-9986-4abde93d34f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.513774 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/52f6d754-4e79-4f05-9986-4abde93d34f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.513796 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6x4kp" event={"ID":"193b75ba-c337-4422-88ce-aace97ac7638","Type":"ContainerStarted","Data":"0b77c73bb2cf0a46640ca0b03cc0d13833c308cc4cd43602ffed3c534cc2aac1"} Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.514164 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f6d754-4e79-4f05-9986-4abde93d34f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.518807 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.519129 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2a691f37-33ba-4d5b-988a-f8417e8e630b","Type":"ContainerStarted","Data":"a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc"} Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.519189 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.524023 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.535545 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.359698384 podStartE2EDuration="5.535535621s" podCreationTimestamp="2026-01-25 05:52:44 +0000 UTC" firstStartedPulling="2026-01-25 05:52:44.874477471 +0000 UTC m=+855.910355451" lastFinishedPulling="2026-01-25 05:52:49.050314718 +0000 UTC m=+860.086192688" observedRunningTime="2026-01-25 05:52:49.529936302 +0000 UTC m=+860.565814282" watchObservedRunningTime="2026-01-25 05:52:49.535535621 +0000 UTC m=+860.571413600" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.538893 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf8pc\" (UniqueName: \"kubernetes.io/projected/52f6d754-4e79-4f05-9986-4abde93d34f0-kube-api-access-jf8pc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.539930 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f6d754-4e79-4f05-9986-4abde93d34f0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.556519 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f6d754-4e79-4f05-9986-4abde93d34f0\") " pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.593107 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:49 crc kubenswrapper[4728]: 
I0125 05:52:49.642074 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9dmg"] Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.681112 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mr9hh"] Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.720748 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 25 05:52:49 crc kubenswrapper[4728]: I0125 05:52:49.982806 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.439389 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.441286 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.444629 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.444789 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.444973 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9rwqr" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.456705 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.467569 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.513796 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-n9vwb"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 
05:52:50.517505 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.527657 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.573594 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n9vwb"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.604896 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mr9hh" event={"ID":"8a1328da-8d1f-4f1e-9f8b-d61559200740","Type":"ContainerStarted","Data":"feb58543a2b023ee2d30758b1b83aebed49ecf326927e64b7a88e4b3e765e5da"} Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.632953 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef293374-2620-4494-8bcf-6410e8a53342-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633006 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-ovs-rundir\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633037 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: 
I0125 05:52:50.633077 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct5bp\" (UniqueName: \"kubernetes.io/projected/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-kube-api-access-ct5bp\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633102 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633138 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-ovn-rundir\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633155 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef293374-2620-4494-8bcf-6410e8a53342-config\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633178 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-combined-ca-bundle\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633224 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633247 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffdj\" (UniqueName: \"kubernetes.io/projected/ef293374-2620-4494-8bcf-6410e8a53342-kube-api-access-kffdj\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633270 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633292 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633331 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-config\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.633379 
4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef293374-2620-4494-8bcf-6410e8a53342-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.648280 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5847c87bfc-nlp42"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.672756 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f85787b9-kmmz7"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.674289 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.677853 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.688892 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f85787b9-kmmz7"] Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.735865 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef293374-2620-4494-8bcf-6410e8a53342-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.735919 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef293374-2620-4494-8bcf-6410e8a53342-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.735945 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-ovs-rundir\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.735971 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736013 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct5bp\" (UniqueName: \"kubernetes.io/projected/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-kube-api-access-ct5bp\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736088 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-ovn-rundir\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736103 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef293374-2620-4494-8bcf-6410e8a53342-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736129 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-combined-ca-bundle\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736169 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736189 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kffdj\" (UniqueName: \"kubernetes.io/projected/ef293374-2620-4494-8bcf-6410e8a53342-kube-api-access-kffdj\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736212 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736236 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " 
pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736251 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-config\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.736974 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-config\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.737919 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-ovn-rundir\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.738193 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef293374-2620-4494-8bcf-6410e8a53342-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.738594 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef293374-2620-4494-8bcf-6410e8a53342-config\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.739087 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/ef293374-2620-4494-8bcf-6410e8a53342-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.739153 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-ovs-rundir\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.739757 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.749005 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.749588 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-combined-ca-bundle\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.749999 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.765377 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.769781 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct5bp\" (UniqueName: \"kubernetes.io/projected/37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5-kube-api-access-ct5bp\") pod \"ovn-controller-metrics-n9vwb\" (UID: \"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5\") " pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.800586 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef293374-2620-4494-8bcf-6410e8a53342-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.812106 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffdj\" (UniqueName: \"kubernetes.io/projected/ef293374-2620-4494-8bcf-6410e8a53342-kube-api-access-kffdj\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.827192 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ef293374-2620-4494-8bcf-6410e8a53342\") " pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:50 crc 
kubenswrapper[4728]: I0125 05:52:50.837597 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-config\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.837697 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-dns-svc\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.837756 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-schcp\" (UniqueName: \"kubernetes.io/projected/8caa3c16-a992-4534-b406-bbac9de9baa7-kube-api-access-schcp\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.837800 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-ovsdbserver-nb\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.859227 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n9vwb" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.941171 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-config\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.941632 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-dns-svc\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.941668 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-schcp\" (UniqueName: \"kubernetes.io/projected/8caa3c16-a992-4534-b406-bbac9de9baa7-kube-api-access-schcp\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.941711 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-ovsdbserver-nb\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.942760 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-dns-svc\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc 
kubenswrapper[4728]: I0125 05:52:50.942879 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-ovsdbserver-nb\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.942985 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-config\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:50 crc kubenswrapper[4728]: I0125 05:52:50.959721 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-schcp\" (UniqueName: \"kubernetes.io/projected/8caa3c16-a992-4534-b406-bbac9de9baa7-kube-api-access-schcp\") pod \"dnsmasq-dns-54f85787b9-kmmz7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:51 crc kubenswrapper[4728]: I0125 05:52:51.006553 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:52:51 crc kubenswrapper[4728]: I0125 05:52:51.099809 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 25 05:52:51 crc kubenswrapper[4728]: I0125 05:52:51.603211 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f9dmg" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="registry-server" containerID="cri-o://b25048e2a718805be8f75f573ee3b91ce477eada638aba52c29ec5046512ef46" gracePeriod=2 Jan 25 05:52:52 crc kubenswrapper[4728]: I0125 05:52:52.613335 4728 generic.go:334] "Generic (PLEG): container finished" podID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerID="b25048e2a718805be8f75f573ee3b91ce477eada638aba52c29ec5046512ef46" exitCode=0 Jan 25 05:52:52 crc kubenswrapper[4728]: I0125 05:52:52.613416 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9dmg" event={"ID":"35e52af1-2297-446a-a0ca-bb4d718783c6","Type":"ContainerDied","Data":"b25048e2a718805be8f75f573ee3b91ce477eada638aba52c29ec5046512ef46"} Jan 25 05:52:52 crc kubenswrapper[4728]: W0125 05:52:52.630469 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f6d754_4e79_4f05_9986_4abde93d34f0.slice/crio-e65ab0c57faf4ba757cf70ceffc5c30e28fc38565bfe4715845b9bdc7b4f232f WatchSource:0}: Error finding container e65ab0c57faf4ba757cf70ceffc5c30e28fc38565bfe4715845b9bdc7b4f232f: Status 404 returned error can't find the container with id e65ab0c57faf4ba757cf70ceffc5c30e28fc38565bfe4715845b9bdc7b4f232f Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.239233 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.382028 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-catalog-content\") pod \"35e52af1-2297-446a-a0ca-bb4d718783c6\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.382168 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqffq\" (UniqueName: \"kubernetes.io/projected/35e52af1-2297-446a-a0ca-bb4d718783c6-kube-api-access-lqffq\") pod \"35e52af1-2297-446a-a0ca-bb4d718783c6\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.382295 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-utilities\") pod \"35e52af1-2297-446a-a0ca-bb4d718783c6\" (UID: \"35e52af1-2297-446a-a0ca-bb4d718783c6\") " Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.383048 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-utilities" (OuterVolumeSpecName: "utilities") pod "35e52af1-2297-446a-a0ca-bb4d718783c6" (UID: "35e52af1-2297-446a-a0ca-bb4d718783c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.387559 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e52af1-2297-446a-a0ca-bb4d718783c6-kube-api-access-lqffq" (OuterVolumeSpecName: "kube-api-access-lqffq") pod "35e52af1-2297-446a-a0ca-bb4d718783c6" (UID: "35e52af1-2297-446a-a0ca-bb4d718783c6"). InnerVolumeSpecName "kube-api-access-lqffq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.426095 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35e52af1-2297-446a-a0ca-bb4d718783c6" (UID: "35e52af1-2297-446a-a0ca-bb4d718783c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.484956 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.484980 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35e52af1-2297-446a-a0ca-bb4d718783c6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.484992 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqffq\" (UniqueName: \"kubernetes.io/projected/35e52af1-2297-446a-a0ca-bb4d718783c6-kube-api-access-lqffq\") on node \"crc\" DevicePath \"\"" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.626945 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9dmg" event={"ID":"35e52af1-2297-446a-a0ca-bb4d718783c6","Type":"ContainerDied","Data":"db876a6c67c5822cf103011337fad38e37129ddf73c77738b768eeba6a3b34a5"} Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.627018 4728 scope.go:117] "RemoveContainer" containerID="b25048e2a718805be8f75f573ee3b91ce477eada638aba52c29ec5046512ef46" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.627175 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9dmg" Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.631620 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f6d754-4e79-4f05-9986-4abde93d34f0","Type":"ContainerStarted","Data":"e65ab0c57faf4ba757cf70ceffc5c30e28fc38565bfe4715845b9bdc7b4f232f"} Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.665492 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9dmg"] Jan 25 05:52:53 crc kubenswrapper[4728]: I0125 05:52:53.670949 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f9dmg"] Jan 25 05:52:54 crc kubenswrapper[4728]: I0125 05:52:54.394147 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 25 05:52:55 crc kubenswrapper[4728]: I0125 05:52:55.336116 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" path="/var/lib/kubelet/pods/35e52af1-2297-446a-a0ca-bb4d718783c6/volumes" Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.009064 4728 scope.go:117] "RemoveContainer" containerID="58439ab70faec36a73f783a75a3e954e7ee34173b8e6bd8d1232c68bdb72be76" Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.406989 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f85787b9-kmmz7"] Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.470123 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n9vwb"] Jan 25 05:53:04 crc kubenswrapper[4728]: W0125 05:53:04.506264 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8caa3c16_a992_4534_b406_bbac9de9baa7.slice/crio-f8e72773733b2c4b9883b23b47f809c3dec613a170937c86531a986910765e0e WatchSource:0}: Error finding 
container f8e72773733b2c4b9883b23b47f809c3dec613a170937c86531a986910765e0e: Status 404 returned error can't find the container with id f8e72773733b2c4b9883b23b47f809c3dec613a170937c86531a986910765e0e Jan 25 05:53:04 crc kubenswrapper[4728]: W0125 05:53:04.507506 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e45bde_b2c9_4617_a3e2_c0a1a5db3aa5.slice/crio-e64624fcc743467a994a3910765b4aaa993c9a053439e791a9d7357ad32ac7cb WatchSource:0}: Error finding container e64624fcc743467a994a3910765b4aaa993c9a053439e791a9d7357ad32ac7cb: Status 404 returned error can't find the container with id e64624fcc743467a994a3910765b4aaa993c9a053439e791a9d7357ad32ac7cb Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.545801 4728 scope.go:117] "RemoveContainer" containerID="509a1318aeb8bf578949e338c287c60cdba78db2bf59883101eeee10890042db" Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.548894 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.719147 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n9vwb" event={"ID":"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5","Type":"ContainerStarted","Data":"e64624fcc743467a994a3910765b4aaa993c9a053439e791a9d7357ad32ac7cb"} Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.721559 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ef293374-2620-4494-8bcf-6410e8a53342","Type":"ContainerStarted","Data":"3e2b90428917620d90e58008eeac49e91a25fafea74c972c90cb019aee00b8a6"} Jan 25 05:53:04 crc kubenswrapper[4728]: I0125 05:53:04.723448 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" 
event={"ID":"8caa3c16-a992-4534-b406-bbac9de9baa7","Type":"ContainerStarted","Data":"f8e72773733b2c4b9883b23b47f809c3dec613a170937c86531a986910765e0e"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.732663 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5162db63-7667-482e-a9bd-174365a318cc","Type":"ContainerStarted","Data":"59e5f4d969f6dda0fcb57f96485b87e3fdf481a5b7cb2317d172ae06fb471813"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.733220 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.735725 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2","Type":"ContainerStarted","Data":"245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.737794 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"847abeb1-1f82-44cc-a876-2b8787688696","Type":"ContainerStarted","Data":"622fc02ec0f4a770e1886718cfbc1f58664d1e4cc9dc1eeb70563d4e6661945b"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.740491 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ddd3d99e-20c0-4133-9537-413f83a04edb","Type":"ContainerStarted","Data":"c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.742691 4728 generic.go:334] "Generic (PLEG): container finished" podID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerID="b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934" exitCode=0 Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.742736 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" 
event={"ID":"8caa3c16-a992-4534-b406-bbac9de9baa7","Type":"ContainerDied","Data":"b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.764011 4728 generic.go:334] "Generic (PLEG): container finished" podID="30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" containerID="bf195701a5b35d9eccb2753ed3ca7b261bedc0ab61cdef2acf3c1915e5e1534b" exitCode=0 Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.764108 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" event={"ID":"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252","Type":"ContainerDied","Data":"bf195701a5b35d9eccb2753ed3ca7b261bedc0ab61cdef2acf3c1915e5e1534b"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.764961 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.610857003 podStartE2EDuration="23.764945596s" podCreationTimestamp="2026-01-25 05:52:42 +0000 UTC" firstStartedPulling="2026-01-25 05:52:43.364859905 +0000 UTC m=+854.400737884" lastFinishedPulling="2026-01-25 05:53:04.518948498 +0000 UTC m=+875.554826477" observedRunningTime="2026-01-25 05:53:05.755760066 +0000 UTC m=+876.791638046" watchObservedRunningTime="2026-01-25 05:53:05.764945596 +0000 UTC m=+876.800823576" Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.768878 4728 generic.go:334] "Generic (PLEG): container finished" podID="67a737a1-81e1-4999-9d7d-24f2cbcb9682" containerID="775f9fc4217e30c5b77d7abb6e613a66805b84993320b8e8fed4f318a8e10f88" exitCode=0 Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.768982 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" event={"ID":"67a737a1-81e1-4999-9d7d-24f2cbcb9682","Type":"ContainerDied","Data":"775f9fc4217e30c5b77d7abb6e613a66805b84993320b8e8fed4f318a8e10f88"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.775014 4728 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-6x4kp" event={"ID":"193b75ba-c337-4422-88ce-aace97ac7638","Type":"ContainerStarted","Data":"d7fc2bc9f3216234aa5ea3713854ffbef9215e81ecbee278d0f73d0e4fe59d06"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.775736 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6x4kp" Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.780474 4728 generic.go:334] "Generic (PLEG): container finished" podID="6f9eb7e1-0883-4fcd-a869-064a777694bf" containerID="b892fb6fddf186cdf9e9801ea78e575e2e711aef72c0ce09d81e3d46e8149fc6" exitCode=0 Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.780512 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" event={"ID":"6f9eb7e1-0883-4fcd-a869-064a777694bf","Type":"ContainerDied","Data":"b892fb6fddf186cdf9e9801ea78e575e2e711aef72c0ce09d81e3d46e8149fc6"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.784158 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04585afa-3da7-4da9-896a-2acc02ff910e","Type":"ContainerStarted","Data":"ccd289b98058195c1a8cae65f0b53e3855b8bf3164ac0ca963f5452a507af6cc"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.785856 4728 generic.go:334] "Generic (PLEG): container finished" podID="8a1328da-8d1f-4f1e-9f8b-d61559200740" containerID="8a836c8bcb839244fd6fb9221cb9cdd916c004495766ca06fd36bc5634ac1ac5" exitCode=0 Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.785906 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mr9hh" event={"ID":"8a1328da-8d1f-4f1e-9f8b-d61559200740","Type":"ContainerDied","Data":"8a836c8bcb839244fd6fb9221cb9cdd916c004495766ca06fd36bc5634ac1ac5"} Jan 25 05:53:05 crc kubenswrapper[4728]: I0125 05:53:05.936215 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6x4kp" 
podStartSLOduration=2.844364845 podStartE2EDuration="17.936193044s" podCreationTimestamp="2026-01-25 05:52:48 +0000 UTC" firstStartedPulling="2026-01-25 05:52:49.452469318 +0000 UTC m=+860.488347288" lastFinishedPulling="2026-01-25 05:53:04.544297507 +0000 UTC m=+875.580175487" observedRunningTime="2026-01-25 05:53:05.933111474 +0000 UTC m=+876.968989453" watchObservedRunningTime="2026-01-25 05:53:05.936193044 +0000 UTC m=+876.972071024" Jan 25 05:53:06 crc kubenswrapper[4728]: I0125 05:53:06.796122 4728 generic.go:334] "Generic (PLEG): container finished" podID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerID="9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b" exitCode=0 Jan 25 05:53:06 crc kubenswrapper[4728]: I0125 05:53:06.796283 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" event={"ID":"7c25a9b6-42e8-4e9f-8c23-597279ad0d03","Type":"ContainerDied","Data":"9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b"} Jan 25 05:53:06 crc kubenswrapper[4728]: I0125 05:53:06.798619 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f6d754-4e79-4f05-9986-4abde93d34f0","Type":"ContainerStarted","Data":"4eb21912054741d7b49ac61308d9677c378f47947e3bb1aba753d498af722c7e"} Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.294985 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6g4g"] Jan 25 05:53:07 crc kubenswrapper[4728]: E0125 05:53:07.295395 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="registry-server" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.295414 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="registry-server" Jan 25 05:53:07 crc kubenswrapper[4728]: E0125 05:53:07.295442 4728 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="extract-utilities" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.295449 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="extract-utilities" Jan 25 05:53:07 crc kubenswrapper[4728]: E0125 05:53:07.295466 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="extract-content" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.295472 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="extract-content" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.295633 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e52af1-2297-446a-a0ca-bb4d718783c6" containerName="registry-server" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.296733 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.300484 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6g4g"] Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.424602 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-utilities\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.424668 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-catalog-content\") pod \"community-operators-j6g4g\" (UID: 
\"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.424738 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l77lg\" (UniqueName: \"kubernetes.io/projected/f132cd80-c760-445f-b6bf-41d35700b35c-kube-api-access-l77lg\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.526563 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-utilities\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.526654 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-catalog-content\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.526736 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l77lg\" (UniqueName: \"kubernetes.io/projected/f132cd80-c760-445f-b6bf-41d35700b35c-kube-api-access-l77lg\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.527199 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-utilities\") pod \"community-operators-j6g4g\" (UID: 
\"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.527203 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-catalog-content\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.544247 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l77lg\" (UniqueName: \"kubernetes.io/projected/f132cd80-c760-445f-b6bf-41d35700b35c-kube-api-access-l77lg\") pod \"community-operators-j6g4g\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:07 crc kubenswrapper[4728]: I0125 05:53:07.612642 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.454251 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.460215 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.467695 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.557782 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-dns-svc\") pod \"6f9eb7e1-0883-4fcd-a869-064a777694bf\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558055 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5l6q\" (UniqueName: \"kubernetes.io/projected/6f9eb7e1-0883-4fcd-a869-064a777694bf-kube-api-access-q5l6q\") pod \"6f9eb7e1-0883-4fcd-a869-064a777694bf\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558114 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-config\") pod \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558190 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-config\") pod \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558226 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lfrp\" (UniqueName: \"kubernetes.io/projected/67a737a1-81e1-4999-9d7d-24f2cbcb9682-kube-api-access-6lfrp\") pod \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558295 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-dns-svc\") pod \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\" (UID: \"67a737a1-81e1-4999-9d7d-24f2cbcb9682\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558354 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-config\") pod \"6f9eb7e1-0883-4fcd-a869-064a777694bf\" (UID: \"6f9eb7e1-0883-4fcd-a869-064a777694bf\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.558382 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrf5g\" (UniqueName: \"kubernetes.io/projected/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-kube-api-access-nrf5g\") pod \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\" (UID: \"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252\") " Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.561630 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a737a1-81e1-4999-9d7d-24f2cbcb9682-kube-api-access-6lfrp" (OuterVolumeSpecName: "kube-api-access-6lfrp") pod "67a737a1-81e1-4999-9d7d-24f2cbcb9682" (UID: "67a737a1-81e1-4999-9d7d-24f2cbcb9682"). InnerVolumeSpecName "kube-api-access-6lfrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.561907 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-kube-api-access-nrf5g" (OuterVolumeSpecName: "kube-api-access-nrf5g") pod "30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" (UID: "30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252"). InnerVolumeSpecName "kube-api-access-nrf5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.562167 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9eb7e1-0883-4fcd-a869-064a777694bf-kube-api-access-q5l6q" (OuterVolumeSpecName: "kube-api-access-q5l6q") pod "6f9eb7e1-0883-4fcd-a869-064a777694bf" (UID: "6f9eb7e1-0883-4fcd-a869-064a777694bf"). InnerVolumeSpecName "kube-api-access-q5l6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.572864 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-config" (OuterVolumeSpecName: "config") pod "30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" (UID: "30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.573023 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f9eb7e1-0883-4fcd-a869-064a777694bf" (UID: "6f9eb7e1-0883-4fcd-a869-064a777694bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.574105 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-config" (OuterVolumeSpecName: "config") pod "6f9eb7e1-0883-4fcd-a869-064a777694bf" (UID: "6f9eb7e1-0883-4fcd-a869-064a777694bf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.574462 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-config" (OuterVolumeSpecName: "config") pod "67a737a1-81e1-4999-9d7d-24f2cbcb9682" (UID: "67a737a1-81e1-4999-9d7d-24f2cbcb9682"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.574522 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67a737a1-81e1-4999-9d7d-24f2cbcb9682" (UID: "67a737a1-81e1-4999-9d7d-24f2cbcb9682"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659470 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659495 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5l6q\" (UniqueName: \"kubernetes.io/projected/6f9eb7e1-0883-4fcd-a869-064a777694bf-kube-api-access-q5l6q\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659508 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659517 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659525 
4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lfrp\" (UniqueName: \"kubernetes.io/projected/67a737a1-81e1-4999-9d7d-24f2cbcb9682-kube-api-access-6lfrp\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659535 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a737a1-81e1-4999-9d7d-24f2cbcb9682-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659543 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9eb7e1-0883-4fcd-a869-064a777694bf-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.659551 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrf5g\" (UniqueName: \"kubernetes.io/projected/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252-kube-api-access-nrf5g\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.823604 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" event={"ID":"30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252","Type":"ContainerDied","Data":"e9d841c094cf45294730fdb0fb66112766ec6cb0cb532f5150306e91be1735f9"} Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.823630 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d89c95f8f-g82r5" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.823662 4728 scope.go:117] "RemoveContainer" containerID="bf195701a5b35d9eccb2753ed3ca7b261bedc0ab61cdef2acf3c1915e5e1534b" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.825567 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" event={"ID":"67a737a1-81e1-4999-9d7d-24f2cbcb9682","Type":"ContainerDied","Data":"94620b5e54eafe6d781e8c4bf191a45f56d3dc2b6f8d6ae3bc9d00c21e0e701d"} Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.825625 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5847c87bfc-nlp42" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.827232 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" event={"ID":"6f9eb7e1-0883-4fcd-a869-064a777694bf","Type":"ContainerDied","Data":"889ad8ba30b061330a36d8975297ce4a3b969a7f5cde634c4de5fa0cd62acdf8"} Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.827340 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cdfb76457-cr7br" Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.829647 4728 generic.go:334] "Generic (PLEG): container finished" podID="04585afa-3da7-4da9-896a-2acc02ff910e" containerID="ccd289b98058195c1a8cae65f0b53e3855b8bf3164ac0ca963f5452a507af6cc" exitCode=0 Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.829694 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04585afa-3da7-4da9-896a-2acc02ff910e","Type":"ContainerDied","Data":"ccd289b98058195c1a8cae65f0b53e3855b8bf3164ac0ca963f5452a507af6cc"} Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.890307 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5847c87bfc-nlp42"] Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.903024 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5847c87bfc-nlp42"] Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.922437 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d89c95f8f-g82r5"] Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.926301 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d89c95f8f-g82r5"] Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.934143 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cdfb76457-cr7br"] Jan 25 05:53:09 crc kubenswrapper[4728]: I0125 05:53:09.938132 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cdfb76457-cr7br"] Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.559545 4728 scope.go:117] "RemoveContainer" containerID="775f9fc4217e30c5b77d7abb6e613a66805b84993320b8e8fed4f318a8e10f88" Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.702121 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6g4g"] Jan 25 05:53:10 crc 
kubenswrapper[4728]: W0125 05:53:10.770837 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice/crio-30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8 WatchSource:0}: Error finding container 30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8: Status 404 returned error can't find the container with id 30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8 Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.801617 4728 scope.go:117] "RemoveContainer" containerID="b892fb6fddf186cdf9e9801ea78e575e2e711aef72c0ce09d81e3d46e8149fc6" Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.841935 4728 generic.go:334] "Generic (PLEG): container finished" podID="847abeb1-1f82-44cc-a876-2b8787688696" containerID="622fc02ec0f4a770e1886718cfbc1f58664d1e4cc9dc1eeb70563d4e6661945b" exitCode=0 Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.841997 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"847abeb1-1f82-44cc-a876-2b8787688696","Type":"ContainerDied","Data":"622fc02ec0f4a770e1886718cfbc1f58664d1e4cc9dc1eeb70563d4e6661945b"} Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.844742 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" event={"ID":"8caa3c16-a992-4534-b406-bbac9de9baa7","Type":"ContainerStarted","Data":"28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab"} Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.844846 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.846650 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6g4g" 
event={"ID":"f132cd80-c760-445f-b6bf-41d35700b35c","Type":"ContainerStarted","Data":"30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8"} Jan 25 05:53:10 crc kubenswrapper[4728]: I0125 05:53:10.884052 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" podStartSLOduration=20.884027445 podStartE2EDuration="20.884027445s" podCreationTimestamp="2026-01-25 05:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:53:10.878214564 +0000 UTC m=+881.914092544" watchObservedRunningTime="2026-01-25 05:53:10.884027445 +0000 UTC m=+881.919905425" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.338725 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" path="/var/lib/kubelet/pods/30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252/volumes" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.339655 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a737a1-81e1-4999-9d7d-24f2cbcb9682" path="/var/lib/kubelet/pods/67a737a1-81e1-4999-9d7d-24f2cbcb9682/volumes" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.340170 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9eb7e1-0883-4fcd-a869-064a777694bf" path="/var/lib/kubelet/pods/6f9eb7e1-0883-4fcd-a869-064a777694bf/volumes" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.865473 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mr9hh" event={"ID":"8a1328da-8d1f-4f1e-9f8b-d61559200740","Type":"ContainerStarted","Data":"c3b9666303c29367c71c027bd9ea92a60d6c95a6bce59862c1307661b8e7324f"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.865872 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:53:11 crc kubenswrapper[4728]: 
I0125 05:53:11.865891 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mr9hh" event={"ID":"8a1328da-8d1f-4f1e-9f8b-d61559200740","Type":"ContainerStarted","Data":"d786363f3e30c4ad1b4b05319a8717342977a5bb102c9570703722bcd8d6557c"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.865906 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.867031 4728 generic.go:334] "Generic (PLEG): container finished" podID="f132cd80-c760-445f-b6bf-41d35700b35c" containerID="696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e" exitCode=0 Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.867102 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6g4g" event={"ID":"f132cd80-c760-445f-b6bf-41d35700b35c","Type":"ContainerDied","Data":"696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.869546 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n9vwb" event={"ID":"37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5","Type":"ContainerStarted","Data":"3bc6e48f206fce73a8555c76f2f629f4a278a8291c776b3a85061e7746badc11"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.873655 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f6d754-4e79-4f05-9986-4abde93d34f0","Type":"ContainerStarted","Data":"2e630f292dd14f4846e76a881fa4c3edd1fcaec7b0d307315d8e55e07ebe851c"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.875338 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"847abeb1-1f82-44cc-a876-2b8787688696","Type":"ContainerStarted","Data":"3729dda5b9552fa677794a66316463a369424135d6757331b71e733d5182598d"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 
05:53:11.878133 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ef293374-2620-4494-8bcf-6410e8a53342","Type":"ContainerStarted","Data":"3e452f19fc72303692882a7335998999d27a425678a368980c1c9e541802ef90"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.878163 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ef293374-2620-4494-8bcf-6410e8a53342","Type":"ContainerStarted","Data":"94a930eb00a62f0bd35d3a874dd9e039ee281291271a025f850ce519e8bb5260"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.881649 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04585afa-3da7-4da9-896a-2acc02ff910e","Type":"ContainerStarted","Data":"da2a89d1d57b22c49a4845147cfceea3ed532f2366f2e489b1e5d4a017c34489"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.884364 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" event={"ID":"7c25a9b6-42e8-4e9f-8c23-597279ad0d03","Type":"ContainerStarted","Data":"f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37"} Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.884402 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.892679 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-mr9hh" podStartSLOduration=9.06706967 podStartE2EDuration="23.892669231s" podCreationTimestamp="2026-01-25 05:52:48 +0000 UTC" firstStartedPulling="2026-01-25 05:52:49.693374545 +0000 UTC m=+860.729252525" lastFinishedPulling="2026-01-25 05:53:04.518974107 +0000 UTC m=+875.554852086" observedRunningTime="2026-01-25 05:53:11.888677304 +0000 UTC m=+882.924555284" watchObservedRunningTime="2026-01-25 05:53:11.892669231 +0000 UTC m=+882.928547211" Jan 25 05:53:11 crc 
kubenswrapper[4728]: I0125 05:53:11.903814 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" podStartSLOduration=8.574397596 podStartE2EDuration="33.903796613s" podCreationTimestamp="2026-01-25 05:52:38 +0000 UTC" firstStartedPulling="2026-01-25 05:52:39.264343655 +0000 UTC m=+850.300221635" lastFinishedPulling="2026-01-25 05:53:04.593742671 +0000 UTC m=+875.629620652" observedRunningTime="2026-01-25 05:53:11.901619458 +0000 UTC m=+882.937497437" watchObservedRunningTime="2026-01-25 05:53:11.903796613 +0000 UTC m=+882.939674593" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.916728 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.685819042 podStartE2EDuration="23.916712157s" podCreationTimestamp="2026-01-25 05:52:48 +0000 UTC" firstStartedPulling="2026-01-25 05:52:52.634601885 +0000 UTC m=+863.670479864" lastFinishedPulling="2026-01-25 05:53:10.865495 +0000 UTC m=+881.901372979" observedRunningTime="2026-01-25 05:53:11.916344373 +0000 UTC m=+882.952222363" watchObservedRunningTime="2026-01-25 05:53:11.916712157 +0000 UTC m=+882.952590137" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.944165 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.066536326 podStartE2EDuration="32.944150816s" podCreationTimestamp="2026-01-25 05:52:39 +0000 UTC" firstStartedPulling="2026-01-25 05:52:41.67033358 +0000 UTC m=+852.706211560" lastFinishedPulling="2026-01-25 05:53:04.54794807 +0000 UTC m=+875.583826050" observedRunningTime="2026-01-25 05:53:11.938243798 +0000 UTC m=+882.974121777" watchObservedRunningTime="2026-01-25 05:53:11.944150816 +0000 UTC m=+882.980028796" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.970073 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" 
podStartSLOduration=9.832568454 podStartE2EDuration="30.970046255s" podCreationTimestamp="2026-01-25 05:52:41 +0000 UTC" firstStartedPulling="2026-01-25 05:52:43.381249439 +0000 UTC m=+854.417127419" lastFinishedPulling="2026-01-25 05:53:04.518727241 +0000 UTC m=+875.554605220" observedRunningTime="2026-01-25 05:53:11.966944075 +0000 UTC m=+883.002822054" watchObservedRunningTime="2026-01-25 05:53:11.970046255 +0000 UTC m=+883.005924224" Jan 25 05:53:11 crc kubenswrapper[4728]: I0125 05:53:11.982617 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.204779296 podStartE2EDuration="22.982603703s" podCreationTimestamp="2026-01-25 05:52:49 +0000 UTC" firstStartedPulling="2026-01-25 05:53:04.556564116 +0000 UTC m=+875.592442097" lastFinishedPulling="2026-01-25 05:53:10.334388524 +0000 UTC m=+881.370266504" observedRunningTime="2026-01-25 05:53:11.980068903 +0000 UTC m=+883.015946883" watchObservedRunningTime="2026-01-25 05:53:11.982603703 +0000 UTC m=+883.018481683" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.001125 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-n9vwb" podStartSLOduration=15.655746538 podStartE2EDuration="22.001106161s" podCreationTimestamp="2026-01-25 05:52:50 +0000 UTC" firstStartedPulling="2026-01-25 05:53:04.519008451 +0000 UTC m=+875.554886431" lastFinishedPulling="2026-01-25 05:53:10.864368074 +0000 UTC m=+881.900246054" observedRunningTime="2026-01-25 05:53:11.993546487 +0000 UTC m=+883.029424457" watchObservedRunningTime="2026-01-25 05:53:12.001106161 +0000 UTC m=+883.036984141" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.100909 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.395743 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5f67ddf84f-r6lmd"] Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.423591 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7687f56cdc-9s4cq"] Jan 25 05:53:12 crc kubenswrapper[4728]: E0125 05:53:12.423981 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a737a1-81e1-4999-9d7d-24f2cbcb9682" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.424004 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a737a1-81e1-4999-9d7d-24f2cbcb9682" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: E0125 05:53:12.424016 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9eb7e1-0883-4fcd-a869-064a777694bf" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.424022 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9eb7e1-0883-4fcd-a869-064a777694bf" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: E0125 05:53:12.424051 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.424056 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.424250 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b4c07b-aa8a-4a7e-b7cf-4d16d2fe3252" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.424277 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="67a737a1-81e1-4999-9d7d-24f2cbcb9682" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.424287 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9eb7e1-0883-4fcd-a869-064a777694bf" containerName="init" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.425170 4728 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.427218 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.436283 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7687f56cdc-9s4cq"] Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.514629 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-dns-svc\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.514844 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.514917 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.514952 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdsq\" (UniqueName: \"kubernetes.io/projected/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-kube-api-access-nmdsq\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: 
\"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.515002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-config\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.618360 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-dns-svc\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.618584 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.618635 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.618655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdsq\" (UniqueName: \"kubernetes.io/projected/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-kube-api-access-nmdsq\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " 
pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.618691 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-config\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.619550 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-config\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.619549 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-dns-svc\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.620121 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.620575 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.638948 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdsq\" (UniqueName: \"kubernetes.io/projected/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-kube-api-access-nmdsq\") pod \"dnsmasq-dns-7687f56cdc-9s4cq\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.739435 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.852471 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.852675 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.896296 4728 generic.go:334] "Generic (PLEG): container finished" podID="f132cd80-c760-445f-b6bf-41d35700b35c" containerID="6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7" exitCode=0 Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.896405 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6g4g" event={"ID":"f132cd80-c760-445f-b6bf-41d35700b35c","Type":"ContainerDied","Data":"6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7"} Jan 25 05:53:12 crc kubenswrapper[4728]: I0125 05:53:12.915632 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.146228 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7687f56cdc-9s4cq"] Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.721554 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 
05:53:13.753924 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.909183 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6g4g" event={"ID":"f132cd80-c760-445f-b6bf-41d35700b35c","Type":"ContainerStarted","Data":"2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864"} Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.912939 4728 generic.go:334] "Generic (PLEG): container finished" podID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerID="0eac5d66783250a895aca1e0f0d00fdbc69f4b5141d9266103b50770240e5770" exitCode=0 Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.913423 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" event={"ID":"85b49ae9-5410-4a73-a256-ea55dd3d1bfe","Type":"ContainerDied","Data":"0eac5d66783250a895aca1e0f0d00fdbc69f4b5141d9266103b50770240e5770"} Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.913466 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" event={"ID":"85b49ae9-5410-4a73-a256-ea55dd3d1bfe","Type":"ContainerStarted","Data":"28cedffc06a704f12a04cc1cca89e14dafba9414a5a7074a66ade9ad46231fb2"} Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.913808 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.913969 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerName="dnsmasq-dns" containerID="cri-o://f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37" gracePeriod=10 Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.933893 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-j6g4g" podStartSLOduration=5.441028317 podStartE2EDuration="6.933872917s" podCreationTimestamp="2026-01-25 05:53:07 +0000 UTC" firstStartedPulling="2026-01-25 05:53:11.868614442 +0000 UTC m=+882.904492422" lastFinishedPulling="2026-01-25 05:53:13.361459042 +0000 UTC m=+884.397337022" observedRunningTime="2026-01-25 05:53:13.931103725 +0000 UTC m=+884.966981705" watchObservedRunningTime="2026-01-25 05:53:13.933872917 +0000 UTC m=+884.969750897" Jan 25 05:53:13 crc kubenswrapper[4728]: I0125 05:53:13.957849 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.269419 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.328881 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f85787b9-kmmz7"] Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.329136 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerName="dnsmasq-dns" containerID="cri-o://28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab" gracePeriod=10 Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.357580 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb45645f7-6b8qk"] Jan 25 05:53:14 crc kubenswrapper[4728]: E0125 05:53:14.357902 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerName="init" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.357920 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerName="init" Jan 25 05:53:14 crc kubenswrapper[4728]: E0125 05:53:14.357953 4728 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerName="dnsmasq-dns" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.357960 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerName="dnsmasq-dns" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.358087 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerName="dnsmasq-dns" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.358822 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.377843 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb45645f7-6b8qk"] Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.456247 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q7m2\" (UniqueName: \"kubernetes.io/projected/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-kube-api-access-6q7m2\") pod \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.456307 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-dns-svc\") pod \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.457058 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-config\") pod \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\" (UID: \"7c25a9b6-42e8-4e9f-8c23-597279ad0d03\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.457186 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-config\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.457371 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnjn\" (UniqueName: \"kubernetes.io/projected/f62a2680-ed4b-449b-925c-e243731ea8b4-kube-api-access-9mnjn\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.457409 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.457469 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.457499 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-dns-svc\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: 
I0125 05:53:14.461830 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-kube-api-access-6q7m2" (OuterVolumeSpecName: "kube-api-access-6q7m2") pod "7c25a9b6-42e8-4e9f-8c23-597279ad0d03" (UID: "7c25a9b6-42e8-4e9f-8c23-597279ad0d03"). InnerVolumeSpecName "kube-api-access-6q7m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.487500 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-config" (OuterVolumeSpecName: "config") pod "7c25a9b6-42e8-4e9f-8c23-597279ad0d03" (UID: "7c25a9b6-42e8-4e9f-8c23-597279ad0d03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.493274 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c25a9b6-42e8-4e9f-8c23-597279ad0d03" (UID: "7c25a9b6-42e8-4e9f-8c23-597279ad0d03"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.561643 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnjn\" (UniqueName: \"kubernetes.io/projected/f62a2680-ed4b-449b-925c-e243731ea8b4-kube-api-access-9mnjn\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.561725 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.561808 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.561865 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-dns-svc\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.561925 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-config\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 
05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.562205 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.562217 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q7m2\" (UniqueName: \"kubernetes.io/projected/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-kube-api-access-6q7m2\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.562230 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25a9b6-42e8-4e9f-8c23-597279ad0d03-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.563070 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-config\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.563211 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-dns-svc\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.563661 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.563860 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.585271 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnjn\" (UniqueName: \"kubernetes.io/projected/f62a2680-ed4b-449b-925c-e243731ea8b4-kube-api-access-9mnjn\") pod \"dnsmasq-dns-6cb45645f7-6b8qk\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.734695 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.784450 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.866510 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-dns-svc\") pod \"8caa3c16-a992-4534-b406-bbac9de9baa7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.866558 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-config\") pod \"8caa3c16-a992-4534-b406-bbac9de9baa7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.866618 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-schcp\" (UniqueName: 
\"kubernetes.io/projected/8caa3c16-a992-4534-b406-bbac9de9baa7-kube-api-access-schcp\") pod \"8caa3c16-a992-4534-b406-bbac9de9baa7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.866681 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-ovsdbserver-nb\") pod \"8caa3c16-a992-4534-b406-bbac9de9baa7\" (UID: \"8caa3c16-a992-4534-b406-bbac9de9baa7\") " Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.876115 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8caa3c16-a992-4534-b406-bbac9de9baa7-kube-api-access-schcp" (OuterVolumeSpecName: "kube-api-access-schcp") pod "8caa3c16-a992-4534-b406-bbac9de9baa7" (UID: "8caa3c16-a992-4534-b406-bbac9de9baa7"). InnerVolumeSpecName "kube-api-access-schcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.906054 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-config" (OuterVolumeSpecName: "config") pod "8caa3c16-a992-4534-b406-bbac9de9baa7" (UID: "8caa3c16-a992-4534-b406-bbac9de9baa7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.906749 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8caa3c16-a992-4534-b406-bbac9de9baa7" (UID: "8caa3c16-a992-4534-b406-bbac9de9baa7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.907992 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8caa3c16-a992-4534-b406-bbac9de9baa7" (UID: "8caa3c16-a992-4534-b406-bbac9de9baa7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.921609 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" event={"ID":"85b49ae9-5410-4a73-a256-ea55dd3d1bfe","Type":"ContainerStarted","Data":"21c8a9a55ff5f9701a8aac301599c839b9f46f54219f2c1f7e2595f19a047900"} Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.921703 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.923061 4728 generic.go:334] "Generic (PLEG): container finished" podID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" containerID="f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37" exitCode=0 Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.923110 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" event={"ID":"7c25a9b6-42e8-4e9f-8c23-597279ad0d03","Type":"ContainerDied","Data":"f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37"} Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.923133 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" event={"ID":"7c25a9b6-42e8-4e9f-8c23-597279ad0d03","Type":"ContainerDied","Data":"8819bceca0ea0cbf768c10ab102158f8f9772f8133f916dacb2c9b47fd8ba5f8"} Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.923149 4728 scope.go:117] "RemoveContainer" 
containerID="f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.923247 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f67ddf84f-r6lmd" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.928662 4728 generic.go:334] "Generic (PLEG): container finished" podID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerID="28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab" exitCode=0 Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.928707 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" event={"ID":"8caa3c16-a992-4534-b406-bbac9de9baa7","Type":"ContainerDied","Data":"28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab"} Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.928736 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" event={"ID":"8caa3c16-a992-4534-b406-bbac9de9baa7","Type":"ContainerDied","Data":"f8e72773733b2c4b9883b23b47f809c3dec613a170937c86531a986910765e0e"} Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.928770 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f85787b9-kmmz7" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.947438 4728 scope.go:117] "RemoveContainer" containerID="9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.963073 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" podStartSLOduration=2.963059523 podStartE2EDuration="2.963059523s" podCreationTimestamp="2026-01-25 05:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:53:14.943636208 +0000 UTC m=+885.979514188" watchObservedRunningTime="2026-01-25 05:53:14.963059523 +0000 UTC m=+885.998937503" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.966506 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f67ddf84f-r6lmd"] Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.969577 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.969600 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.969611 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-schcp\" (UniqueName: \"kubernetes.io/projected/8caa3c16-a992-4534-b406-bbac9de9baa7-kube-api-access-schcp\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.969619 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8caa3c16-a992-4534-b406-bbac9de9baa7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.970186 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f67ddf84f-r6lmd"] Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.983447 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f85787b9-kmmz7"] Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.985061 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f85787b9-kmmz7"] Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.987685 4728 scope.go:117] "RemoveContainer" containerID="f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37" Jan 25 05:53:14 crc kubenswrapper[4728]: E0125 05:53:14.989866 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37\": container with ID starting with f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37 not found: ID does not exist" containerID="f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.989899 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37"} err="failed to get container status \"f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37\": rpc error: code = NotFound desc = could not find container \"f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37\": container with ID starting with f9d37b6d28de6dd7796e6686c67a002b1d7d7105dbd851faf1d34a32379a6f37 not found: ID does not exist" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.989925 4728 scope.go:117] "RemoveContainer" 
containerID="9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b" Jan 25 05:53:14 crc kubenswrapper[4728]: E0125 05:53:14.994446 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b\": container with ID starting with 9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b not found: ID does not exist" containerID="9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.994489 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b"} err="failed to get container status \"9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b\": rpc error: code = NotFound desc = could not find container \"9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b\": container with ID starting with 9c9e62820e2029585a48013a82051c3c00401586fa0cf9269f1b981a04ed0c5b not found: ID does not exist" Jan 25 05:53:14 crc kubenswrapper[4728]: I0125 05:53:14.994514 4728 scope.go:117] "RemoveContainer" containerID="28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.018625 4728 scope.go:117] "RemoveContainer" containerID="b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.034498 4728 scope.go:117] "RemoveContainer" containerID="28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab" Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.035146 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab\": container with ID starting with 
28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab not found: ID does not exist" containerID="28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.035187 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab"} err="failed to get container status \"28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab\": rpc error: code = NotFound desc = could not find container \"28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab\": container with ID starting with 28048d1f65a1960c58c682c298181c1933e128649ef2ce157ebf8ead8c14a7ab not found: ID does not exist" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.035214 4728 scope.go:117] "RemoveContainer" containerID="b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934" Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.035990 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934\": container with ID starting with b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934 not found: ID does not exist" containerID="b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.036029 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934"} err="failed to get container status \"b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934\": rpc error: code = NotFound desc = could not find container \"b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934\": container with ID starting with b953c7a36815488a8bda0ab34661168ba9c803cf0b5ea9d71737a399b76d2934 not found: ID does not 
exist" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.130776 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.131189 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.140389 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb45645f7-6b8qk"] Jan 25 05:53:15 crc kubenswrapper[4728]: W0125 05:53:15.153850 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf62a2680_ed4b_449b_925c_e243731ea8b4.slice/crio-ba4464f237638a76c94754e8a7dc8662ff2877b2f333e38e5be930f374d69c71 WatchSource:0}: Error finding container ba4464f237638a76c94754e8a7dc8662ff2877b2f333e38e5be930f374d69c71: Status 404 returned error can't find the container with id ba4464f237638a76c94754e8a7dc8662ff2877b2f333e38e5be930f374d69c71 Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.164148 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.317661 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.318041 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerName="dnsmasq-dns" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.318062 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerName="dnsmasq-dns" Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.318093 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerName="init" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 
05:53:15.318099 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerName="init" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.318259 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" containerName="dnsmasq-dns" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.319187 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.322048 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-drlbf" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.322234 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.322847 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.323135 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.326802 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.343067 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c25a9b6-42e8-4e9f-8c23-597279ad0d03" path="/var/lib/kubelet/pods/7c25a9b6-42e8-4e9f-8c23-597279ad0d03/volumes" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.343699 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8caa3c16-a992-4534-b406-bbac9de9baa7" path="/var/lib/kubelet/pods/8caa3c16-a992-4534-b406-bbac9de9baa7/volumes" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.440393 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 25 05:53:15 crc 
kubenswrapper[4728]: I0125 05:53:15.448338 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.448529 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.452743 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.453603 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.453832 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.454041 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2lpwz" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.479677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnx5n\" (UniqueName: \"kubernetes.io/projected/fe5324ca-4693-4d57-84e1-b2facac597bc-kube-api-access-mnx5n\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.479728 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.479748 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.479783 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.479808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe5324ca-4693-4d57-84e1-b2facac597bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.479934 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5324ca-4693-4d57-84e1-b2facac597bc-config\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.480023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe5324ca-4693-4d57-84e1-b2facac597bc-scripts\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.582876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " 
pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583297 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c74720-9ea2-42cd-93d6-1c17ede15e62-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583378 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583442 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5324ca-4693-4d57-84e1-b2facac597bc-config\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583569 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe5324ca-4693-4d57-84e1-b2facac597bc-scripts\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583647 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftrzj\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-kube-api-access-ftrzj\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583711 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f3c74720-9ea2-42cd-93d6-1c17ede15e62-lock\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583766 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c74720-9ea2-42cd-93d6-1c17ede15e62-cache\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583850 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnx5n\" (UniqueName: \"kubernetes.io/projected/fe5324ca-4693-4d57-84e1-b2facac597bc-kube-api-access-mnx5n\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583910 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.583939 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.584044 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.584101 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe5324ca-4693-4d57-84e1-b2facac597bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.584946 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe5324ca-4693-4d57-84e1-b2facac597bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.585449 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5324ca-4693-4d57-84e1-b2facac597bc-config\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.585505 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe5324ca-4693-4d57-84e1-b2facac597bc-scripts\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.589266 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.590697 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.591933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5324ca-4693-4d57-84e1-b2facac597bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.599585 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnx5n\" (UniqueName: \"kubernetes.io/projected/fe5324ca-4693-4d57-84e1-b2facac597bc-kube-api-access-mnx5n\") pod \"ovn-northd-0\" (UID: \"fe5324ca-4693-4d57-84e1-b2facac597bc\") " pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.654805 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bf7"] Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.656296 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.664853 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bf7"] Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.673098 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.685462 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftrzj\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-kube-api-access-ftrzj\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.685511 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f3c74720-9ea2-42cd-93d6-1c17ede15e62-lock\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.685535 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c74720-9ea2-42cd-93d6-1c17ede15e62-cache\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.685607 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.685632 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c74720-9ea2-42cd-93d6-1c17ede15e62-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.685653 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.685829 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.685853 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 25 05:53:15 crc kubenswrapper[4728]: E0125 05:53:15.685903 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift podName:f3c74720-9ea2-42cd-93d6-1c17ede15e62 nodeName:}" failed. No retries permitted until 2026-01-25 05:53:16.185881962 +0000 UTC m=+887.221759943 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift") pod "swift-storage-0" (UID: "f3c74720-9ea2-42cd-93d6-1c17ede15e62") : configmap "swift-ring-files" not found Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.687024 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f3c74720-9ea2-42cd-93d6-1c17ede15e62-lock\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.687266 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f3c74720-9ea2-42cd-93d6-1c17ede15e62-cache\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.687538 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.704139 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c74720-9ea2-42cd-93d6-1c17ede15e62-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.706937 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftrzj\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-kube-api-access-ftrzj\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.728181 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.787441 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpvn\" (UniqueName: \"kubernetes.io/projected/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-kube-api-access-bjpvn\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.787488 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-utilities\") 
pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.787588 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-catalog-content\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.889931 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpvn\" (UniqueName: \"kubernetes.io/projected/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-kube-api-access-bjpvn\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.890254 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-utilities\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.890338 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-catalog-content\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.890742 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-utilities\") pod 
\"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.890934 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-catalog-content\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.907683 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpvn\" (UniqueName: \"kubernetes.io/projected/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-kube-api-access-bjpvn\") pod \"redhat-marketplace-x7bf7\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.937637 4728 generic.go:334] "Generic (PLEG): container finished" podID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerID="9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595" exitCode=0 Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.938980 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" event={"ID":"f62a2680-ed4b-449b-925c-e243731ea8b4","Type":"ContainerDied","Data":"9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595"} Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.939006 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" event={"ID":"f62a2680-ed4b-449b-925c-e243731ea8b4","Type":"ContainerStarted","Data":"ba4464f237638a76c94754e8a7dc8662ff2877b2f333e38e5be930f374d69c71"} Jan 25 05:53:15 crc kubenswrapper[4728]: I0125 05:53:15.976901 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.114566 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 25 05:53:16 crc kubenswrapper[4728]: W0125 05:53:16.123233 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5324ca_4693_4d57_84e1_b2facac597bc.slice/crio-053fea464dcee5734661df91b9e4d522cbef0182949c717e2e8e3fe98db75b8f WatchSource:0}: Error finding container 053fea464dcee5734661df91b9e4d522cbef0182949c717e2e8e3fe98db75b8f: Status 404 returned error can't find the container with id 053fea464dcee5734661df91b9e4d522cbef0182949c717e2e8e3fe98db75b8f Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.195348 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:16 crc kubenswrapper[4728]: E0125 05:53:16.195656 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 25 05:53:16 crc kubenswrapper[4728]: E0125 05:53:16.195689 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 25 05:53:16 crc kubenswrapper[4728]: E0125 05:53:16.195802 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift podName:f3c74720-9ea2-42cd-93d6-1c17ede15e62 nodeName:}" failed. No retries permitted until 2026-01-25 05:53:17.195782121 +0000 UTC m=+888.231660091 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift") pod "swift-storage-0" (UID: "f3c74720-9ea2-42cd-93d6-1c17ede15e62") : configmap "swift-ring-files" not found Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.384291 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bf7"] Jan 25 05:53:16 crc kubenswrapper[4728]: W0125 05:53:16.388472 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice/crio-0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c WatchSource:0}: Error finding container 0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c: Status 404 returned error can't find the container with id 0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.950179 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe5324ca-4693-4d57-84e1-b2facac597bc","Type":"ContainerStarted","Data":"053fea464dcee5734661df91b9e4d522cbef0182949c717e2e8e3fe98db75b8f"} Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.952309 4728 generic.go:334] "Generic (PLEG): container finished" podID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerID="dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb" exitCode=0 Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.952434 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bf7" event={"ID":"fb9aaed5-97ec-40d8-97dc-9783fd1c682f","Type":"ContainerDied","Data":"dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb"} Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.952505 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-x7bf7" event={"ID":"fb9aaed5-97ec-40d8-97dc-9783fd1c682f","Type":"ContainerStarted","Data":"0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c"} Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.955739 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" event={"ID":"f62a2680-ed4b-449b-925c-e243731ea8b4","Type":"ContainerStarted","Data":"e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99"} Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.956841 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:16 crc kubenswrapper[4728]: I0125 05:53:16.995689 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" podStartSLOduration=2.995662522 podStartE2EDuration="2.995662522s" podCreationTimestamp="2026-01-25 05:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:53:16.990603723 +0000 UTC m=+888.026481703" watchObservedRunningTime="2026-01-25 05:53:16.995662522 +0000 UTC m=+888.031540492" Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.212303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:17 crc kubenswrapper[4728]: E0125 05:53:17.212509 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 25 05:53:17 crc kubenswrapper[4728]: E0125 05:53:17.212526 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not 
found Jan 25 05:53:17 crc kubenswrapper[4728]: E0125 05:53:17.212580 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift podName:f3c74720-9ea2-42cd-93d6-1c17ede15e62 nodeName:}" failed. No retries permitted until 2026-01-25 05:53:19.212564427 +0000 UTC m=+890.248442407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift") pod "swift-storage-0" (UID: "f3c74720-9ea2-42cd-93d6-1c17ede15e62") : configmap "swift-ring-files" not found Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.613865 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.614304 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.657035 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.963892 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe5324ca-4693-4d57-84e1-b2facac597bc","Type":"ContainerStarted","Data":"78c9812c177d4ad5efc64e82e84efc2f2a0601a474a1383c3702f0cd6a11dc96"} Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.964289 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe5324ca-4693-4d57-84e1-b2facac597bc","Type":"ContainerStarted","Data":"41efc06d07fd2e3cb7fe3cebe5a199b09e55942bded7e622ff6febfca650827c"} Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.964305 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 25 05:53:17 crc kubenswrapper[4728]: 
I0125 05:53:17.967272 4728 generic.go:334] "Generic (PLEG): container finished" podID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerID="4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29" exitCode=0 Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.967359 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bf7" event={"ID":"fb9aaed5-97ec-40d8-97dc-9783fd1c682f","Type":"ContainerDied","Data":"4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29"} Jan 25 05:53:17 crc kubenswrapper[4728]: I0125 05:53:17.999570 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.576698936 podStartE2EDuration="2.999539955s" podCreationTimestamp="2026-01-25 05:53:15 +0000 UTC" firstStartedPulling="2026-01-25 05:53:16.125345022 +0000 UTC m=+887.161223003" lastFinishedPulling="2026-01-25 05:53:17.548186042 +0000 UTC m=+888.584064022" observedRunningTime="2026-01-25 05:53:17.982253378 +0000 UTC m=+889.018131359" watchObservedRunningTime="2026-01-25 05:53:17.999539955 +0000 UTC m=+889.035417935" Jan 25 05:53:18 crc kubenswrapper[4728]: I0125 05:53:18.936587 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 25 05:53:18 crc kubenswrapper[4728]: I0125 05:53:18.978456 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bf7" event={"ID":"fb9aaed5-97ec-40d8-97dc-9783fd1c682f","Type":"ContainerStarted","Data":"cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a"} Jan 25 05:53:18 crc kubenswrapper[4728]: I0125 05:53:18.995208 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x7bf7" podStartSLOduration=2.557889028 podStartE2EDuration="3.995191035s" podCreationTimestamp="2026-01-25 05:53:15 +0000 UTC" firstStartedPulling="2026-01-25 
05:53:16.954851778 +0000 UTC m=+887.990729748" lastFinishedPulling="2026-01-25 05:53:18.392153775 +0000 UTC m=+889.428031755" observedRunningTime="2026-01-25 05:53:18.994839151 +0000 UTC m=+890.030717132" watchObservedRunningTime="2026-01-25 05:53:18.995191035 +0000 UTC m=+890.031069015" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.001614 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.249349 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:19 crc kubenswrapper[4728]: E0125 05:53:19.249581 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 25 05:53:19 crc kubenswrapper[4728]: E0125 05:53:19.249902 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 25 05:53:19 crc kubenswrapper[4728]: E0125 05:53:19.249970 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift podName:f3c74720-9ea2-42cd-93d6-1c17ede15e62 nodeName:}" failed. No retries permitted until 2026-01-25 05:53:23.249949693 +0000 UTC m=+894.285827673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift") pod "swift-storage-0" (UID: "f3c74720-9ea2-42cd-93d6-1c17ede15e62") : configmap "swift-ring-files" not found Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.422016 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7kctt"] Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.423095 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.426174 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.426435 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.426501 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.433476 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7kctt"] Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.447762 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7kctt"] Jan 25 05:53:19 crc kubenswrapper[4728]: E0125 05:53:19.448451 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-5zvz9 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-5zvz9 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-7kctt" podUID="debabb70-8f8f-4294-9e29-2883e4596f9e" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.495268 4728 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-575nj"] Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.498131 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.506270 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-575nj"] Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.555610 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-swiftconf\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.555701 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-swiftconf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.555850 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzlf\" (UniqueName: \"kubernetes.io/projected/747ed3cf-861f-46d7-8411-3c3318fbff34-kube-api-access-zxzlf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.555882 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-combined-ca-bundle\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") 
" pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.555946 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/debabb70-8f8f-4294-9e29-2883e4596f9e-etc-swift\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.555995 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-combined-ca-bundle\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zvz9\" (UniqueName: \"kubernetes.io/projected/debabb70-8f8f-4294-9e29-2883e4596f9e-kube-api-access-5zvz9\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556091 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-dispersionconf\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556174 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-ring-data-devices\") pod \"swift-ring-rebalance-7kctt\" (UID: 
\"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556231 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-dispersionconf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556288 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/747ed3cf-861f-46d7-8411-3c3318fbff34-etc-swift\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556311 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-ring-data-devices\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556384 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-scripts\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.556405 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-scripts\") pod \"swift-ring-rebalance-575nj\" (UID: 
\"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.658258 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-dispersionconf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.658876 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/747ed3cf-861f-46d7-8411-3c3318fbff34-etc-swift\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659011 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-ring-data-devices\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659117 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-scripts\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659201 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-scripts\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc 
kubenswrapper[4728]: I0125 05:53:19.659333 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-swiftconf\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659466 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-swiftconf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659642 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzlf\" (UniqueName: \"kubernetes.io/projected/747ed3cf-861f-46d7-8411-3c3318fbff34-kube-api-access-zxzlf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659724 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-combined-ca-bundle\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659814 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/debabb70-8f8f-4294-9e29-2883e4596f9e-etc-swift\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659894 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-combined-ca-bundle\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.659996 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zvz9\" (UniqueName: \"kubernetes.io/projected/debabb70-8f8f-4294-9e29-2883e4596f9e-kube-api-access-5zvz9\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.660048 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-ring-data-devices\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.660078 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-dispersionconf\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.660180 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-ring-data-devices\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.662748 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/747ed3cf-861f-46d7-8411-3c3318fbff34-etc-swift\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.664098 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-ring-data-devices\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.664457 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/debabb70-8f8f-4294-9e29-2883e4596f9e-etc-swift\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.667425 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-scripts\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.667851 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-scripts\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.688714 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-dispersionconf\") pod \"swift-ring-rebalance-7kctt\" (UID: 
\"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.699991 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-combined-ca-bundle\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.704791 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-swiftconf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.705081 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-swiftconf\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.705435 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-dispersionconf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.705522 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-combined-ca-bundle\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 
05:53:19.706798 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzlf\" (UniqueName: \"kubernetes.io/projected/747ed3cf-861f-46d7-8411-3c3318fbff34-kube-api-access-zxzlf\") pod \"swift-ring-rebalance-575nj\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.707865 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zvz9\" (UniqueName: \"kubernetes.io/projected/debabb70-8f8f-4294-9e29-2883e4596f9e-kube-api-access-5zvz9\") pod \"swift-ring-rebalance-7kctt\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.811985 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.989211 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:19 crc kubenswrapper[4728]: I0125 05:53:19.999709 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.067904 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zvz9\" (UniqueName: \"kubernetes.io/projected/debabb70-8f8f-4294-9e29-2883e4596f9e-kube-api-access-5zvz9\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.067948 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/debabb70-8f8f-4294-9e29-2883e4596f9e-etc-swift\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.068007 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-ring-data-devices\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.068148 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-combined-ca-bundle\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.068198 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-dispersionconf\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.068215 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-swiftconf\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.068286 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-scripts\") pod \"debabb70-8f8f-4294-9e29-2883e4596f9e\" (UID: \"debabb70-8f8f-4294-9e29-2883e4596f9e\") " Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.070382 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debabb70-8f8f-4294-9e29-2883e4596f9e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.070757 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.070788 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-scripts" (OuterVolumeSpecName: "scripts") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.072565 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debabb70-8f8f-4294-9e29-2883e4596f9e-kube-api-access-5zvz9" (OuterVolumeSpecName: "kube-api-access-5zvz9") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "kube-api-access-5zvz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.074728 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.078937 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.079138 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "debabb70-8f8f-4294-9e29-2883e4596f9e" (UID: "debabb70-8f8f-4294-9e29-2883e4596f9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169667 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169696 4728 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169709 4728 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/debabb70-8f8f-4294-9e29-2883e4596f9e-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169718 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169727 4728 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/debabb70-8f8f-4294-9e29-2883e4596f9e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169736 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zvz9\" (UniqueName: \"kubernetes.io/projected/debabb70-8f8f-4294-9e29-2883e4596f9e-kube-api-access-5zvz9\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.169747 4728 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/debabb70-8f8f-4294-9e29-2883e4596f9e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.196742 4728 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-575nj"] Jan 25 05:53:20 crc kubenswrapper[4728]: W0125 05:53:20.202913 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod747ed3cf_861f_46d7_8411_3c3318fbff34.slice/crio-ac5f13e5bc7d021ef3fbb85ccbc765991c58b1467fd41cf4f265d0ec6ac56761 WatchSource:0}: Error finding container ac5f13e5bc7d021ef3fbb85ccbc765991c58b1467fd41cf4f265d0ec6ac56761: Status 404 returned error can't find the container with id ac5f13e5bc7d021ef3fbb85ccbc765991c58b1467fd41cf4f265d0ec6ac56761 Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.997894 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kctt" Jan 25 05:53:20 crc kubenswrapper[4728]: I0125 05:53:20.998077 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-575nj" event={"ID":"747ed3cf-861f-46d7-8411-3c3318fbff34","Type":"ContainerStarted","Data":"ac5f13e5bc7d021ef3fbb85ccbc765991c58b1467fd41cf4f265d0ec6ac56761"} Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.042695 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7kctt"] Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.048976 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-7kctt"] Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.060959 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.061096 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.128705 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 
05:53:21.305973 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lrwqb"] Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.306968 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.309115 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.312660 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lrwqb"] Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.336693 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debabb70-8f8f-4294-9e29-2883e4596f9e" path="/var/lib/kubelet/pods/debabb70-8f8f-4294-9e29-2883e4596f9e/volumes" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.389763 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-operator-scripts\") pod \"root-account-create-update-lrwqb\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.389830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5jw\" (UniqueName: \"kubernetes.io/projected/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-kube-api-access-5m5jw\") pod \"root-account-create-update-lrwqb\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.491994 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-operator-scripts\") pod \"root-account-create-update-lrwqb\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.492069 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5jw\" (UniqueName: \"kubernetes.io/projected/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-kube-api-access-5m5jw\") pod \"root-account-create-update-lrwqb\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.493236 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-operator-scripts\") pod \"root-account-create-update-lrwqb\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.509592 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5jw\" (UniqueName: \"kubernetes.io/projected/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-kube-api-access-5m5jw\") pod \"root-account-create-update-lrwqb\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:21 crc kubenswrapper[4728]: I0125 05:53:21.628466 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.018279 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lrwqb"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.085723 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.451928 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-98z6f"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.453249 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.459247 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-98z6f"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.507437 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6294\" (UniqueName: \"kubernetes.io/projected/e8ac425f-438a-4797-8689-86ca94810696-kube-api-access-m6294\") pod \"keystone-db-create-98z6f\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.507542 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac425f-438a-4797-8689-86ca94810696-operator-scripts\") pod \"keystone-db-create-98z6f\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.579652 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-16ae-account-create-update-mglnl"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.580708 4728 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.584075 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.585384 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-16ae-account-create-update-mglnl"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.610580 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6294\" (UniqueName: \"kubernetes.io/projected/e8ac425f-438a-4797-8689-86ca94810696-kube-api-access-m6294\") pod \"keystone-db-create-98z6f\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.610749 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac425f-438a-4797-8689-86ca94810696-operator-scripts\") pod \"keystone-db-create-98z6f\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.611522 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac425f-438a-4797-8689-86ca94810696-operator-scripts\") pod \"keystone-db-create-98z6f\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.634634 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6294\" (UniqueName: \"kubernetes.io/projected/e8ac425f-438a-4797-8689-86ca94810696-kube-api-access-m6294\") pod \"keystone-db-create-98z6f\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " pod="openstack/keystone-db-create-98z6f" Jan 25 
05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.713995 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa32c669-499d-4df6-b58f-0fc9680ac7b2-operator-scripts\") pod \"keystone-16ae-account-create-update-mglnl\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.714103 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98bm\" (UniqueName: \"kubernetes.io/projected/aa32c669-499d-4df6-b58f-0fc9680ac7b2-kube-api-access-h98bm\") pod \"keystone-16ae-account-create-update-mglnl\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.741470 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.770342 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.809975 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-q55tq"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.810885 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-q55tq" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.815856 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa32c669-499d-4df6-b58f-0fc9680ac7b2-operator-scripts\") pod \"keystone-16ae-account-create-update-mglnl\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.815914 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98bm\" (UniqueName: \"kubernetes.io/projected/aa32c669-499d-4df6-b58f-0fc9680ac7b2-kube-api-access-h98bm\") pod \"keystone-16ae-account-create-update-mglnl\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.817849 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa32c669-499d-4df6-b58f-0fc9680ac7b2-operator-scripts\") pod \"keystone-16ae-account-create-update-mglnl\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.838519 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-q55tq"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.852147 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98bm\" (UniqueName: \"kubernetes.io/projected/aa32c669-499d-4df6-b58f-0fc9680ac7b2-kube-api-access-h98bm\") pod \"keystone-16ae-account-create-update-mglnl\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.867040 4728 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/placement-5c15-account-create-update-fnc86"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.868441 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.873222 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.878552 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c15-account-create-update-fnc86"] Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.899156 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.917416 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js7gx\" (UniqueName: \"kubernetes.io/projected/68bda3b6-c13a-4843-88be-c192ea6c8777-kube-api-access-js7gx\") pod \"placement-db-create-q55tq\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " pod="openstack/placement-db-create-q55tq" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.917584 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5qf7\" (UniqueName: \"kubernetes.io/projected/b680d31f-ff1c-460a-8937-0f97023ba959-kube-api-access-v5qf7\") pod \"placement-5c15-account-create-update-fnc86\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.917677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68bda3b6-c13a-4843-88be-c192ea6c8777-operator-scripts\") pod 
\"placement-db-create-q55tq\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " pod="openstack/placement-db-create-q55tq" Jan 25 05:53:22 crc kubenswrapper[4728]: I0125 05:53:22.917727 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b680d31f-ff1c-460a-8937-0f97023ba959-operator-scripts\") pod \"placement-5c15-account-create-update-fnc86\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.020696 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5qf7\" (UniqueName: \"kubernetes.io/projected/b680d31f-ff1c-460a-8937-0f97023ba959-kube-api-access-v5qf7\") pod \"placement-5c15-account-create-update-fnc86\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.020742 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68bda3b6-c13a-4843-88be-c192ea6c8777-operator-scripts\") pod \"placement-db-create-q55tq\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " pod="openstack/placement-db-create-q55tq" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.020800 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b680d31f-ff1c-460a-8937-0f97023ba959-operator-scripts\") pod \"placement-5c15-account-create-update-fnc86\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.020881 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js7gx\" (UniqueName: 
\"kubernetes.io/projected/68bda3b6-c13a-4843-88be-c192ea6c8777-kube-api-access-js7gx\") pod \"placement-db-create-q55tq\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " pod="openstack/placement-db-create-q55tq" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.021522 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68bda3b6-c13a-4843-88be-c192ea6c8777-operator-scripts\") pod \"placement-db-create-q55tq\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " pod="openstack/placement-db-create-q55tq" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.021844 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b680d31f-ff1c-460a-8937-0f97023ba959-operator-scripts\") pod \"placement-5c15-account-create-update-fnc86\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.032075 4728 generic.go:334] "Generic (PLEG): container finished" podID="ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" containerID="3328c6210345602068e557b5ee3b4c617763769aee701f8b67a4a22a054943a4" exitCode=0 Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.032168 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lrwqb" event={"ID":"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d","Type":"ContainerDied","Data":"3328c6210345602068e557b5ee3b4c617763769aee701f8b67a4a22a054943a4"} Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.032217 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lrwqb" event={"ID":"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d","Type":"ContainerStarted","Data":"b3b9b953f43037b30af3d143beb8a6b5f84bfe271f55f2a7476a77f7c2bc4119"} Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.034347 4728 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-db-create-8njbn"] Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.037360 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js7gx\" (UniqueName: \"kubernetes.io/projected/68bda3b6-c13a-4843-88be-c192ea6c8777-kube-api-access-js7gx\") pod \"placement-db-create-q55tq\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " pod="openstack/placement-db-create-q55tq" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.039211 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.039906 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5qf7\" (UniqueName: \"kubernetes.io/projected/b680d31f-ff1c-460a-8937-0f97023ba959-kube-api-access-v5qf7\") pod \"placement-5c15-account-create-update-fnc86\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.051866 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8njbn"] Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.122816 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c7493c-1258-4bdd-a009-525474fe9aed-operator-scripts\") pod \"glance-db-create-8njbn\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.122868 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9mqn\" (UniqueName: \"kubernetes.io/projected/11c7493c-1258-4bdd-a009-525474fe9aed-kube-api-access-w9mqn\") pod \"glance-db-create-8njbn\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " 
pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.159699 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-q55tq" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.180799 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8e95-account-create-update-s4qm5"] Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.182229 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.184075 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.188227 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.196405 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8e95-account-create-update-s4qm5"] Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.224801 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9mqn\" (UniqueName: \"kubernetes.io/projected/11c7493c-1258-4bdd-a009-525474fe9aed-kube-api-access-w9mqn\") pod \"glance-db-create-8njbn\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.224864 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec75869-ff42-4ec7-b69b-da9d72fe052a-operator-scripts\") pod \"glance-8e95-account-create-update-s4qm5\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 
05:53:23.224930 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrbxc\" (UniqueName: \"kubernetes.io/projected/5ec75869-ff42-4ec7-b69b-da9d72fe052a-kube-api-access-rrbxc\") pod \"glance-8e95-account-create-update-s4qm5\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.225211 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c7493c-1258-4bdd-a009-525474fe9aed-operator-scripts\") pod \"glance-db-create-8njbn\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.226079 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c7493c-1258-4bdd-a009-525474fe9aed-operator-scripts\") pod \"glance-db-create-8njbn\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.238906 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9mqn\" (UniqueName: \"kubernetes.io/projected/11c7493c-1258-4bdd-a009-525474fe9aed-kube-api-access-w9mqn\") pod \"glance-db-create-8njbn\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.327260 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec75869-ff42-4ec7-b69b-da9d72fe052a-operator-scripts\") pod \"glance-8e95-account-create-update-s4qm5\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 
05:53:23.327314 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.327426 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrbxc\" (UniqueName: \"kubernetes.io/projected/5ec75869-ff42-4ec7-b69b-da9d72fe052a-kube-api-access-rrbxc\") pod \"glance-8e95-account-create-update-s4qm5\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: E0125 05:53:23.327685 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 25 05:53:23 crc kubenswrapper[4728]: E0125 05:53:23.327714 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 25 05:53:23 crc kubenswrapper[4728]: E0125 05:53:23.327776 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift podName:f3c74720-9ea2-42cd-93d6-1c17ede15e62 nodeName:}" failed. No retries permitted until 2026-01-25 05:53:31.327755901 +0000 UTC m=+902.363633880 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift") pod "swift-storage-0" (UID: "f3c74720-9ea2-42cd-93d6-1c17ede15e62") : configmap "swift-ring-files" not found Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.328067 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec75869-ff42-4ec7-b69b-da9d72fe052a-operator-scripts\") pod \"glance-8e95-account-create-update-s4qm5\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.342904 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrbxc\" (UniqueName: \"kubernetes.io/projected/5ec75869-ff42-4ec7-b69b-da9d72fe052a-kube-api-access-rrbxc\") pod \"glance-8e95-account-create-update-s4qm5\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.397023 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8njbn" Jan 25 05:53:23 crc kubenswrapper[4728]: I0125 05:53:23.502834 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.737151 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.785585 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7687f56cdc-9s4cq"] Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.785771 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerName="dnsmasq-dns" containerID="cri-o://21c8a9a55ff5f9701a8aac301599c839b9f46f54219f2c1f7e2595f19a047900" gracePeriod=10 Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.918451 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.975496 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m5jw\" (UniqueName: \"kubernetes.io/projected/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-kube-api-access-5m5jw\") pod \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.975785 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-operator-scripts\") pod \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\" (UID: \"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d\") " Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.978763 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" (UID: "ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:24 crc kubenswrapper[4728]: I0125 05:53:24.983583 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-kube-api-access-5m5jw" (OuterVolumeSpecName: "kube-api-access-5m5jw") pod "ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" (UID: "ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d"). InnerVolumeSpecName "kube-api-access-5m5jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.054851 4728 generic.go:334] "Generic (PLEG): container finished" podID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerID="21c8a9a55ff5f9701a8aac301599c839b9f46f54219f2c1f7e2595f19a047900" exitCode=0 Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.054910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" event={"ID":"85b49ae9-5410-4a73-a256-ea55dd3d1bfe","Type":"ContainerDied","Data":"21c8a9a55ff5f9701a8aac301599c839b9f46f54219f2c1f7e2595f19a047900"} Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.059919 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lrwqb" event={"ID":"ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d","Type":"ContainerDied","Data":"b3b9b953f43037b30af3d143beb8a6b5f84bfe271f55f2a7476a77f7c2bc4119"} Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.059962 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lrwqb" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.059978 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b9b953f43037b30af3d143beb8a6b5f84bfe271f55f2a7476a77f7c2bc4119" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.078913 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.078944 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m5jw\" (UniqueName: \"kubernetes.io/projected/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d-kube-api-access-5m5jw\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.251448 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.268819 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c15-account-create-update-fnc86"] Jan 25 05:53:25 crc kubenswrapper[4728]: W0125 05:53:25.274953 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb680d31f_ff1c_460a_8937_0f97023ba959.slice/crio-93aac434b400583a8975279167f25b832a48c85895fd73fd7e18c69ce83c3291 WatchSource:0}: Error finding container 93aac434b400583a8975279167f25b832a48c85895fd73fd7e18c69ce83c3291: Status 404 returned error can't find the container with id 93aac434b400583a8975279167f25b832a48c85895fd73fd7e18c69ce83c3291 Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.385781 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmdsq\" (UniqueName: 
\"kubernetes.io/projected/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-kube-api-access-nmdsq\") pod \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.386169 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-sb\") pod \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.386398 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-config\") pod \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.386439 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-nb\") pod \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.386487 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-dns-svc\") pod \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\" (UID: \"85b49ae9-5410-4a73-a256-ea55dd3d1bfe\") " Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.399023 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-kube-api-access-nmdsq" (OuterVolumeSpecName: "kube-api-access-nmdsq") pod "85b49ae9-5410-4a73-a256-ea55dd3d1bfe" (UID: "85b49ae9-5410-4a73-a256-ea55dd3d1bfe"). InnerVolumeSpecName "kube-api-access-nmdsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.425930 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-q55tq"] Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.433622 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-16ae-account-create-update-mglnl"] Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.439407 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8njbn"] Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.444783 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-98z6f"] Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.450007 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8e95-account-create-update-s4qm5"] Jan 25 05:53:25 crc kubenswrapper[4728]: W0125 05:53:25.452369 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec75869_ff42_4ec7_b69b_da9d72fe052a.slice/crio-dbe05b97c3c91564bd94fe9c2f77e3e61da24d3f8e71d823dc1579d9e31c5afe WatchSource:0}: Error finding container dbe05b97c3c91564bd94fe9c2f77e3e61da24d3f8e71d823dc1579d9e31c5afe: Status 404 returned error can't find the container with id dbe05b97c3c91564bd94fe9c2f77e3e61da24d3f8e71d823dc1579d9e31c5afe Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.481936 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85b49ae9-5410-4a73-a256-ea55dd3d1bfe" (UID: "85b49ae9-5410-4a73-a256-ea55dd3d1bfe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.482633 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-config" (OuterVolumeSpecName: "config") pod "85b49ae9-5410-4a73-a256-ea55dd3d1bfe" (UID: "85b49ae9-5410-4a73-a256-ea55dd3d1bfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.483359 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85b49ae9-5410-4a73-a256-ea55dd3d1bfe" (UID: "85b49ae9-5410-4a73-a256-ea55dd3d1bfe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.487271 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85b49ae9-5410-4a73-a256-ea55dd3d1bfe" (UID: "85b49ae9-5410-4a73-a256-ea55dd3d1bfe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.488536 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.488559 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.488568 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.488580 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.488590 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmdsq\" (UniqueName: \"kubernetes.io/projected/85b49ae9-5410-4a73-a256-ea55dd3d1bfe-kube-api-access-nmdsq\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.977618 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:25 crc kubenswrapper[4728]: I0125 05:53:25.978682 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.024372 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 
05:53:26.067700 4728 generic.go:334] "Generic (PLEG): container finished" podID="5ec75869-ff42-4ec7-b69b-da9d72fe052a" containerID="add62ed99e48fda7ce68c3d4cd1477e0e56f164479f6d0f212ca761517ea8a8a" exitCode=0 Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.067828 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8e95-account-create-update-s4qm5" event={"ID":"5ec75869-ff42-4ec7-b69b-da9d72fe052a","Type":"ContainerDied","Data":"add62ed99e48fda7ce68c3d4cd1477e0e56f164479f6d0f212ca761517ea8a8a"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.067957 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8e95-account-create-update-s4qm5" event={"ID":"5ec75869-ff42-4ec7-b69b-da9d72fe052a","Type":"ContainerStarted","Data":"dbe05b97c3c91564bd94fe9c2f77e3e61da24d3f8e71d823dc1579d9e31c5afe"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.069246 4728 generic.go:334] "Generic (PLEG): container finished" podID="aa32c669-499d-4df6-b58f-0fc9680ac7b2" containerID="160886ff4fbc54544041622e98bd3d08f4780e8146efe633ec126f29c2e4e724" exitCode=0 Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.069294 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16ae-account-create-update-mglnl" event={"ID":"aa32c669-499d-4df6-b58f-0fc9680ac7b2","Type":"ContainerDied","Data":"160886ff4fbc54544041622e98bd3d08f4780e8146efe633ec126f29c2e4e724"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.069450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16ae-account-create-update-mglnl" event={"ID":"aa32c669-499d-4df6-b58f-0fc9680ac7b2","Type":"ContainerStarted","Data":"838ec174fd0f6ac4b0a60c460756a98a0ceee61d4234b30a91f2b4d64846d795"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.071090 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.071086 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7687f56cdc-9s4cq" event={"ID":"85b49ae9-5410-4a73-a256-ea55dd3d1bfe","Type":"ContainerDied","Data":"28cedffc06a704f12a04cc1cca89e14dafba9414a5a7074a66ade9ad46231fb2"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.071209 4728 scope.go:117] "RemoveContainer" containerID="21c8a9a55ff5f9701a8aac301599c839b9f46f54219f2c1f7e2595f19a047900" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.072524 4728 generic.go:334] "Generic (PLEG): container finished" podID="11c7493c-1258-4bdd-a009-525474fe9aed" containerID="290c131b92248c1600293b5d3a084e826fa55d32e2142e1bb68b9f56e37d7cad" exitCode=0 Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.072577 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8njbn" event={"ID":"11c7493c-1258-4bdd-a009-525474fe9aed","Type":"ContainerDied","Data":"290c131b92248c1600293b5d3a084e826fa55d32e2142e1bb68b9f56e37d7cad"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.072593 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8njbn" event={"ID":"11c7493c-1258-4bdd-a009-525474fe9aed","Type":"ContainerStarted","Data":"37a3bdfc775d1f98d08f59867aed554b0e9929d0feab8632254977c80db100d8"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.075937 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-575nj" event={"ID":"747ed3cf-861f-46d7-8411-3c3318fbff34","Type":"ContainerStarted","Data":"4fc3b8c2499c9361965e470f0a3e35ca515d24994ae4c5844cbaa0e40f3fa638"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.077522 4728 generic.go:334] "Generic (PLEG): container finished" podID="68bda3b6-c13a-4843-88be-c192ea6c8777" containerID="6cd0b205a90e0434075ee8aedb18d7ff40f87510d38eeadfee691fef8a419ddf" 
exitCode=0 Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.077619 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-q55tq" event={"ID":"68bda3b6-c13a-4843-88be-c192ea6c8777","Type":"ContainerDied","Data":"6cd0b205a90e0434075ee8aedb18d7ff40f87510d38eeadfee691fef8a419ddf"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.077657 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-q55tq" event={"ID":"68bda3b6-c13a-4843-88be-c192ea6c8777","Type":"ContainerStarted","Data":"e263aba087b263673bc107423ebe90412c36fa2f53eb143c58b59ad600630b33"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.080968 4728 generic.go:334] "Generic (PLEG): container finished" podID="e8ac425f-438a-4797-8689-86ca94810696" containerID="33affe8305d05049193d8c76af3dd0ff87bfedc43a8ef45804f1ce64afde2817" exitCode=0 Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.081053 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-98z6f" event={"ID":"e8ac425f-438a-4797-8689-86ca94810696","Type":"ContainerDied","Data":"33affe8305d05049193d8c76af3dd0ff87bfedc43a8ef45804f1ce64afde2817"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.082055 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-98z6f" event={"ID":"e8ac425f-438a-4797-8689-86ca94810696","Type":"ContainerStarted","Data":"770566ede622f99d06ed49a59d90ba9b5514aa36d33d0efc14f52b22513d5cf0"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.082220 4728 generic.go:334] "Generic (PLEG): container finished" podID="b680d31f-ff1c-460a-8937-0f97023ba959" containerID="3aed13db85c4bbd8d8fb9403d7cff155a0a52dd9df4a1b441ae146e03519eef2" exitCode=0 Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.082247 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c15-account-create-update-fnc86" 
event={"ID":"b680d31f-ff1c-460a-8937-0f97023ba959","Type":"ContainerDied","Data":"3aed13db85c4bbd8d8fb9403d7cff155a0a52dd9df4a1b441ae146e03519eef2"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.082272 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c15-account-create-update-fnc86" event={"ID":"b680d31f-ff1c-460a-8937-0f97023ba959","Type":"ContainerStarted","Data":"93aac434b400583a8975279167f25b832a48c85895fd73fd7e18c69ce83c3291"} Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.091829 4728 scope.go:117] "RemoveContainer" containerID="0eac5d66783250a895aca1e0f0d00fdbc69f4b5141d9266103b50770240e5770" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.096737 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-575nj" podStartSLOduration=2.438344857 podStartE2EDuration="7.096701833s" podCreationTimestamp="2026-01-25 05:53:19 +0000 UTC" firstStartedPulling="2026-01-25 05:53:20.205155068 +0000 UTC m=+891.241033038" lastFinishedPulling="2026-01-25 05:53:24.863512035 +0000 UTC m=+895.899390014" observedRunningTime="2026-01-25 05:53:26.092112349 +0000 UTC m=+897.127990328" watchObservedRunningTime="2026-01-25 05:53:26.096701833 +0000 UTC m=+897.132579813" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.124601 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.182671 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7687f56cdc-9s4cq"] Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.191336 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7687f56cdc-9s4cq"] Jan 25 05:53:26 crc kubenswrapper[4728]: I0125 05:53:26.256867 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bf7"] Jan 25 05:53:27 crc 
kubenswrapper[4728]: I0125 05:53:27.338473 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" path="/var/lib/kubelet/pods/85b49ae9-5410-4a73-a256-ea55dd3d1bfe/volumes" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.392862 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.524189 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac425f-438a-4797-8689-86ca94810696-operator-scripts\") pod \"e8ac425f-438a-4797-8689-86ca94810696\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.524315 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6294\" (UniqueName: \"kubernetes.io/projected/e8ac425f-438a-4797-8689-86ca94810696-kube-api-access-m6294\") pod \"e8ac425f-438a-4797-8689-86ca94810696\" (UID: \"e8ac425f-438a-4797-8689-86ca94810696\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.524757 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ac425f-438a-4797-8689-86ca94810696-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8ac425f-438a-4797-8689-86ca94810696" (UID: "e8ac425f-438a-4797-8689-86ca94810696"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.524946 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac425f-438a-4797-8689-86ca94810696-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.536628 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ac425f-438a-4797-8689-86ca94810696-kube-api-access-m6294" (OuterVolumeSpecName: "kube-api-access-m6294") pod "e8ac425f-438a-4797-8689-86ca94810696" (UID: "e8ac425f-438a-4797-8689-86ca94810696"). InnerVolumeSpecName "kube-api-access-m6294". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.579355 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8njbn" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.583611 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-q55tq" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.590260 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.600734 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.607001 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626020 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrbxc\" (UniqueName: \"kubernetes.io/projected/5ec75869-ff42-4ec7-b69b-da9d72fe052a-kube-api-access-rrbxc\") pod \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626115 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js7gx\" (UniqueName: \"kubernetes.io/projected/68bda3b6-c13a-4843-88be-c192ea6c8777-kube-api-access-js7gx\") pod \"68bda3b6-c13a-4843-88be-c192ea6c8777\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626296 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9mqn\" (UniqueName: \"kubernetes.io/projected/11c7493c-1258-4bdd-a009-525474fe9aed-kube-api-access-w9mqn\") pod \"11c7493c-1258-4bdd-a009-525474fe9aed\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626361 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c7493c-1258-4bdd-a009-525474fe9aed-operator-scripts\") pod \"11c7493c-1258-4bdd-a009-525474fe9aed\" (UID: \"11c7493c-1258-4bdd-a009-525474fe9aed\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626473 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68bda3b6-c13a-4843-88be-c192ea6c8777-operator-scripts\") pod \"68bda3b6-c13a-4843-88be-c192ea6c8777\" (UID: \"68bda3b6-c13a-4843-88be-c192ea6c8777\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626514 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec75869-ff42-4ec7-b69b-da9d72fe052a-operator-scripts\") pod \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\" (UID: \"5ec75869-ff42-4ec7-b69b-da9d72fe052a\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.626976 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6294\" (UniqueName: \"kubernetes.io/projected/e8ac425f-438a-4797-8689-86ca94810696-kube-api-access-m6294\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.627211 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68bda3b6-c13a-4843-88be-c192ea6c8777-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68bda3b6-c13a-4843-88be-c192ea6c8777" (UID: "68bda3b6-c13a-4843-88be-c192ea6c8777"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.627275 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec75869-ff42-4ec7-b69b-da9d72fe052a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ec75869-ff42-4ec7-b69b-da9d72fe052a" (UID: "5ec75869-ff42-4ec7-b69b-da9d72fe052a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.627609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c7493c-1258-4bdd-a009-525474fe9aed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11c7493c-1258-4bdd-a009-525474fe9aed" (UID: "11c7493c-1258-4bdd-a009-525474fe9aed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.630116 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec75869-ff42-4ec7-b69b-da9d72fe052a-kube-api-access-rrbxc" (OuterVolumeSpecName: "kube-api-access-rrbxc") pod "5ec75869-ff42-4ec7-b69b-da9d72fe052a" (UID: "5ec75869-ff42-4ec7-b69b-da9d72fe052a"). InnerVolumeSpecName "kube-api-access-rrbxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.634486 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c7493c-1258-4bdd-a009-525474fe9aed-kube-api-access-w9mqn" (OuterVolumeSpecName: "kube-api-access-w9mqn") pod "11c7493c-1258-4bdd-a009-525474fe9aed" (UID: "11c7493c-1258-4bdd-a009-525474fe9aed"). InnerVolumeSpecName "kube-api-access-w9mqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.634557 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bda3b6-c13a-4843-88be-c192ea6c8777-kube-api-access-js7gx" (OuterVolumeSpecName: "kube-api-access-js7gx") pod "68bda3b6-c13a-4843-88be-c192ea6c8777" (UID: "68bda3b6-c13a-4843-88be-c192ea6c8777"). InnerVolumeSpecName "kube-api-access-js7gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.662066 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.728276 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b680d31f-ff1c-460a-8937-0f97023ba959-operator-scripts\") pod \"b680d31f-ff1c-460a-8937-0f97023ba959\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.728478 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h98bm\" (UniqueName: \"kubernetes.io/projected/aa32c669-499d-4df6-b58f-0fc9680ac7b2-kube-api-access-h98bm\") pod \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.728693 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5qf7\" (UniqueName: \"kubernetes.io/projected/b680d31f-ff1c-460a-8937-0f97023ba959-kube-api-access-v5qf7\") pod \"b680d31f-ff1c-460a-8937-0f97023ba959\" (UID: \"b680d31f-ff1c-460a-8937-0f97023ba959\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.728814 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa32c669-499d-4df6-b58f-0fc9680ac7b2-operator-scripts\") pod \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\" (UID: \"aa32c669-499d-4df6-b58f-0fc9680ac7b2\") " Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.729456 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68bda3b6-c13a-4843-88be-c192ea6c8777-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc 
kubenswrapper[4728]: I0125 05:53:27.729482 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec75869-ff42-4ec7-b69b-da9d72fe052a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.729498 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrbxc\" (UniqueName: \"kubernetes.io/projected/5ec75869-ff42-4ec7-b69b-da9d72fe052a-kube-api-access-rrbxc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.729513 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js7gx\" (UniqueName: \"kubernetes.io/projected/68bda3b6-c13a-4843-88be-c192ea6c8777-kube-api-access-js7gx\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.729529 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9mqn\" (UniqueName: \"kubernetes.io/projected/11c7493c-1258-4bdd-a009-525474fe9aed-kube-api-access-w9mqn\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.729540 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c7493c-1258-4bdd-a009-525474fe9aed-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.733661 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa32c669-499d-4df6-b58f-0fc9680ac7b2-kube-api-access-h98bm" (OuterVolumeSpecName: "kube-api-access-h98bm") pod "aa32c669-499d-4df6-b58f-0fc9680ac7b2" (UID: "aa32c669-499d-4df6-b58f-0fc9680ac7b2"). InnerVolumeSpecName "kube-api-access-h98bm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.734422 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b680d31f-ff1c-460a-8937-0f97023ba959-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b680d31f-ff1c-460a-8937-0f97023ba959" (UID: "b680d31f-ff1c-460a-8937-0f97023ba959"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.735455 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b680d31f-ff1c-460a-8937-0f97023ba959-kube-api-access-v5qf7" (OuterVolumeSpecName: "kube-api-access-v5qf7") pod "b680d31f-ff1c-460a-8937-0f97023ba959" (UID: "b680d31f-ff1c-460a-8937-0f97023ba959"). InnerVolumeSpecName "kube-api-access-v5qf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.735493 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa32c669-499d-4df6-b58f-0fc9680ac7b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa32c669-499d-4df6-b58f-0fc9680ac7b2" (UID: "aa32c669-499d-4df6-b58f-0fc9680ac7b2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.831562 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5qf7\" (UniqueName: \"kubernetes.io/projected/b680d31f-ff1c-460a-8937-0f97023ba959-kube-api-access-v5qf7\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.831593 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa32c669-499d-4df6-b58f-0fc9680ac7b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.831603 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b680d31f-ff1c-460a-8937-0f97023ba959-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:27 crc kubenswrapper[4728]: I0125 05:53:27.831613 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h98bm\" (UniqueName: \"kubernetes.io/projected/aa32c669-499d-4df6-b58f-0fc9680ac7b2-kube-api-access-h98bm\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.106271 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-q55tq" event={"ID":"68bda3b6-c13a-4843-88be-c192ea6c8777","Type":"ContainerDied","Data":"e263aba087b263673bc107423ebe90412c36fa2f53eb143c58b59ad600630b33"} Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.106339 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e263aba087b263673bc107423ebe90412c36fa2f53eb143c58b59ad600630b33" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.106421 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-q55tq" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.109012 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-98z6f" event={"ID":"e8ac425f-438a-4797-8689-86ca94810696","Type":"ContainerDied","Data":"770566ede622f99d06ed49a59d90ba9b5514aa36d33d0efc14f52b22513d5cf0"} Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.109063 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="770566ede622f99d06ed49a59d90ba9b5514aa36d33d0efc14f52b22513d5cf0" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.109131 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-98z6f" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.120671 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c15-account-create-update-fnc86" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.120668 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c15-account-create-update-fnc86" event={"ID":"b680d31f-ff1c-460a-8937-0f97023ba959","Type":"ContainerDied","Data":"93aac434b400583a8975279167f25b832a48c85895fd73fd7e18c69ce83c3291"} Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.120780 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93aac434b400583a8975279167f25b832a48c85895fd73fd7e18c69ce83c3291" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.128118 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8e95-account-create-update-s4qm5" event={"ID":"5ec75869-ff42-4ec7-b69b-da9d72fe052a","Type":"ContainerDied","Data":"dbe05b97c3c91564bd94fe9c2f77e3e61da24d3f8e71d823dc1579d9e31c5afe"} Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.128143 4728 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="dbe05b97c3c91564bd94fe9c2f77e3e61da24d3f8e71d823dc1579d9e31c5afe" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.128182 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e95-account-create-update-s4qm5" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.129827 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16ae-account-create-update-mglnl" event={"ID":"aa32c669-499d-4df6-b58f-0fc9680ac7b2","Type":"ContainerDied","Data":"838ec174fd0f6ac4b0a60c460756a98a0ceee61d4234b30a91f2b4d64846d795"} Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.129858 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="838ec174fd0f6ac4b0a60c460756a98a0ceee61d4234b30a91f2b4d64846d795" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.129908 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16ae-account-create-update-mglnl" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.136775 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8njbn" event={"ID":"11c7493c-1258-4bdd-a009-525474fe9aed","Type":"ContainerDied","Data":"37a3bdfc775d1f98d08f59867aed554b0e9929d0feab8632254977c80db100d8"} Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.136815 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a3bdfc775d1f98d08f59867aed554b0e9929d0feab8632254977c80db100d8" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.136939 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x7bf7" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="registry-server" containerID="cri-o://cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a" gracePeriod=2 Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.137021 4728 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8njbn" Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.683980 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6g4g"] Jan 25 05:53:28 crc kubenswrapper[4728]: I0125 05:53:28.685225 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6g4g" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="registry-server" containerID="cri-o://2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864" gracePeriod=2 Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.064019 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.121431 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.150944 4728 generic.go:334] "Generic (PLEG): container finished" podID="f132cd80-c760-445f-b6bf-41d35700b35c" containerID="2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864" exitCode=0 Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.151005 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6g4g" event={"ID":"f132cd80-c760-445f-b6bf-41d35700b35c","Type":"ContainerDied","Data":"2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864"} Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.151031 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6g4g" event={"ID":"f132cd80-c760-445f-b6bf-41d35700b35c","Type":"ContainerDied","Data":"30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8"} Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 
05:53:29.151049 4728 scope.go:117] "RemoveContainer" containerID="2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.151142 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6g4g" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.155160 4728 generic.go:334] "Generic (PLEG): container finished" podID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerID="cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a" exitCode=0 Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.155184 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bf7" event={"ID":"fb9aaed5-97ec-40d8-97dc-9783fd1c682f","Type":"ContainerDied","Data":"cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a"} Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.155198 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bf7" event={"ID":"fb9aaed5-97ec-40d8-97dc-9783fd1c682f","Type":"ContainerDied","Data":"0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c"} Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.155238 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bf7" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.159370 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-utilities\") pod \"f132cd80-c760-445f-b6bf-41d35700b35c\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.159498 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-utilities\") pod \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.159546 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-catalog-content\") pod \"f132cd80-c760-445f-b6bf-41d35700b35c\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.159712 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-catalog-content\") pod \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.159821 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l77lg\" (UniqueName: \"kubernetes.io/projected/f132cd80-c760-445f-b6bf-41d35700b35c-kube-api-access-l77lg\") pod \"f132cd80-c760-445f-b6bf-41d35700b35c\" (UID: \"f132cd80-c760-445f-b6bf-41d35700b35c\") " Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.159956 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bjpvn\" (UniqueName: \"kubernetes.io/projected/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-kube-api-access-bjpvn\") pod \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\" (UID: \"fb9aaed5-97ec-40d8-97dc-9783fd1c682f\") " Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.160278 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-utilities" (OuterVolumeSpecName: "utilities") pod "f132cd80-c760-445f-b6bf-41d35700b35c" (UID: "f132cd80-c760-445f-b6bf-41d35700b35c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.160405 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-utilities" (OuterVolumeSpecName: "utilities") pod "fb9aaed5-97ec-40d8-97dc-9783fd1c682f" (UID: "fb9aaed5-97ec-40d8-97dc-9783fd1c682f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.160777 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.160801 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.165494 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-kube-api-access-bjpvn" (OuterVolumeSpecName: "kube-api-access-bjpvn") pod "fb9aaed5-97ec-40d8-97dc-9783fd1c682f" (UID: "fb9aaed5-97ec-40d8-97dc-9783fd1c682f"). InnerVolumeSpecName "kube-api-access-bjpvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.168025 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f132cd80-c760-445f-b6bf-41d35700b35c-kube-api-access-l77lg" (OuterVolumeSpecName: "kube-api-access-l77lg") pod "f132cd80-c760-445f-b6bf-41d35700b35c" (UID: "f132cd80-c760-445f-b6bf-41d35700b35c"). InnerVolumeSpecName "kube-api-access-l77lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.171248 4728 scope.go:117] "RemoveContainer" containerID="6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.176206 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb9aaed5-97ec-40d8-97dc-9783fd1c682f" (UID: "fb9aaed5-97ec-40d8-97dc-9783fd1c682f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.187309 4728 scope.go:117] "RemoveContainer" containerID="696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.203858 4728 scope.go:117] "RemoveContainer" containerID="2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.204191 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864\": container with ID starting with 2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864 not found: ID does not exist" containerID="2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.204252 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864"} err="failed to get container status \"2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864\": rpc error: code = NotFound desc = could not find container \"2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864\": container with ID starting with 2451af90a98df253607dafe4711b021ba086826733e42da385450d7243e22864 not found: ID does not exist" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.204307 4728 scope.go:117] "RemoveContainer" containerID="6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.204683 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7\": container with ID starting with 
6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7 not found: ID does not exist" containerID="6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.204720 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7"} err="failed to get container status \"6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7\": rpc error: code = NotFound desc = could not find container \"6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7\": container with ID starting with 6ad0869efda17366d261b9634b252f549b223c7f52fe0fdfabbb1cb599d3aca7 not found: ID does not exist" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.204748 4728 scope.go:117] "RemoveContainer" containerID="696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.205058 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e\": container with ID starting with 696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e not found: ID does not exist" containerID="696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.205128 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e"} err="failed to get container status \"696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e\": rpc error: code = NotFound desc = could not find container \"696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e\": container with ID starting with 696a97fac5de1c807ecf573b67a24492cd4c3590b45da9bfbf77d93e72e92e9e not found: ID does not 
exist" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.205181 4728 scope.go:117] "RemoveContainer" containerID="cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.205853 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f132cd80-c760-445f-b6bf-41d35700b35c" (UID: "f132cd80-c760-445f-b6bf-41d35700b35c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.223262 4728 scope.go:117] "RemoveContainer" containerID="4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.244280 4728 scope.go:117] "RemoveContainer" containerID="dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.264128 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f132cd80-c760-445f-b6bf-41d35700b35c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.264161 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.264173 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l77lg\" (UniqueName: \"kubernetes.io/projected/f132cd80-c760-445f-b6bf-41d35700b35c-kube-api-access-l77lg\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.264188 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjpvn\" (UniqueName: 
\"kubernetes.io/projected/fb9aaed5-97ec-40d8-97dc-9783fd1c682f-kube-api-access-bjpvn\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.270441 4728 scope.go:117] "RemoveContainer" containerID="cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.270777 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a\": container with ID starting with cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a not found: ID does not exist" containerID="cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.270805 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a"} err="failed to get container status \"cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a\": rpc error: code = NotFound desc = could not find container \"cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a\": container with ID starting with cf294dc6a9daa4e16c82da4c14824eece0f913784bfd0d17a08faa09ac07692a not found: ID does not exist" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.270829 4728 scope.go:117] "RemoveContainer" containerID="4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.271692 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29\": container with ID starting with 4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29 not found: ID does not exist" 
containerID="4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.271719 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29"} err="failed to get container status \"4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29\": rpc error: code = NotFound desc = could not find container \"4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29\": container with ID starting with 4de689b4dc29aeff8c85192f60e7ed65042ccae42e6a5b898be748053c325f29 not found: ID does not exist" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.271735 4728 scope.go:117] "RemoveContainer" containerID="dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.272141 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb\": container with ID starting with dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb not found: ID does not exist" containerID="dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.272165 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb"} err="failed to get container status \"dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb\": rpc error: code = NotFound desc = could not find container \"dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb\": container with ID starting with dcb64efff3c246bbf49d89d3d44623443c4a1d277c6e69616fc0b763107ca4fb not found: ID does not exist" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.478200 4728 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bf7"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.495082 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bf7"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.503678 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6g4g"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.511761 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6g4g"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.733931 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lrwqb"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.743770 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lrwqb"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.825718 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7r9b2"] Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826671 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="extract-content" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826708 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="extract-content" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826732 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="registry-server" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826739 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="registry-server" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826753 4728 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5ec75869-ff42-4ec7-b69b-da9d72fe052a" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826762 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec75869-ff42-4ec7-b69b-da9d72fe052a" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826780 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerName="dnsmasq-dns" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826789 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerName="dnsmasq-dns" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826804 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="registry-server" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826812 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="registry-server" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826829 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerName="init" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826837 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerName="init" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826855 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="extract-utilities" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826862 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="extract-utilities" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826873 4728 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b680d31f-ff1c-460a-8937-0f97023ba959" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826881 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b680d31f-ff1c-460a-8937-0f97023ba959" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826891 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="extract-content" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826898 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="extract-content" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826911 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826918 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826929 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa32c669-499d-4df6-b58f-0fc9680ac7b2" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826936 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa32c669-499d-4df6-b58f-0fc9680ac7b2" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826949 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c7493c-1258-4bdd-a009-525474fe9aed" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826958 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c7493c-1258-4bdd-a009-525474fe9aed" containerName="mariadb-database-create" 
Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826970 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68bda3b6-c13a-4843-88be-c192ea6c8777" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826978 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bda3b6-c13a-4843-88be-c192ea6c8777" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.826987 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ac425f-438a-4797-8689-86ca94810696" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.826994 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac425f-438a-4797-8689-86ca94810696" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: E0125 05:53:29.827011 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="extract-utilities" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827017 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="extract-utilities" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827462 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ac425f-438a-4797-8689-86ca94810696" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827486 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="68bda3b6-c13a-4843-88be-c192ea6c8777" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827499 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b680d31f-ff1c-460a-8937-0f97023ba959" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827510 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aa32c669-499d-4df6-b58f-0fc9680ac7b2" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827521 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" containerName="registry-server" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827531 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" containerName="registry-server" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827547 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c7493c-1258-4bdd-a009-525474fe9aed" containerName="mariadb-database-create" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827554 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b49ae9-5410-4a73-a256-ea55dd3d1bfe" containerName="dnsmasq-dns" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827563 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.827573 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec75869-ff42-4ec7-b69b-da9d72fe052a" containerName="mariadb-account-create-update" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.829268 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.831809 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.836283 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7r9b2"] Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.876761 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbvfp\" (UniqueName: \"kubernetes.io/projected/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-kube-api-access-zbvfp\") pod \"root-account-create-update-7r9b2\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.876824 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-operator-scripts\") pod \"root-account-create-update-7r9b2\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.977978 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbvfp\" (UniqueName: \"kubernetes.io/projected/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-kube-api-access-zbvfp\") pod \"root-account-create-update-7r9b2\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.978032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-operator-scripts\") pod \"root-account-create-update-7r9b2\" (UID: 
\"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:29 crc kubenswrapper[4728]: I0125 05:53:29.978864 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-operator-scripts\") pod \"root-account-create-update-7r9b2\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:30 crc kubenswrapper[4728]: I0125 05:53:30.000140 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbvfp\" (UniqueName: \"kubernetes.io/projected/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-kube-api-access-zbvfp\") pod \"root-account-create-update-7r9b2\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:30 crc kubenswrapper[4728]: I0125 05:53:30.153526 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:30 crc kubenswrapper[4728]: I0125 05:53:30.571907 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7r9b2"] Jan 25 05:53:30 crc kubenswrapper[4728]: W0125 05:53:30.580586 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eaf8418_c302_42cc_bd8e_a0797df4a1a1.slice/crio-9c24faabdb9dea4e64b3467cecd63a4e5dd1a21d9278ffe22530e715881c85c1 WatchSource:0}: Error finding container 9c24faabdb9dea4e64b3467cecd63a4e5dd1a21d9278ffe22530e715881c85c1: Status 404 returned error can't find the container with id 9c24faabdb9dea4e64b3467cecd63a4e5dd1a21d9278ffe22530e715881c85c1 Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.169191 4728 generic.go:334] "Generic (PLEG): container finished" podID="4eaf8418-c302-42cc-bd8e-a0797df4a1a1" containerID="2e7ba0b9a2b6ea3e9ce3a9d0a2d5cdb7c7fc910b1849cc30580dc23438151e33" exitCode=0 Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.169363 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7r9b2" event={"ID":"4eaf8418-c302-42cc-bd8e-a0797df4a1a1","Type":"ContainerDied","Data":"2e7ba0b9a2b6ea3e9ce3a9d0a2d5cdb7c7fc910b1849cc30580dc23438151e33"} Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.169475 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7r9b2" event={"ID":"4eaf8418-c302-42cc-bd8e-a0797df4a1a1","Type":"ContainerStarted","Data":"9c24faabdb9dea4e64b3467cecd63a4e5dd1a21d9278ffe22530e715881c85c1"} Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.338154 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d" path="/var/lib/kubelet/pods/ce55d0de-93a4-4e26-9e19-2a1cf83f2c4d/volumes" Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.338671 4728 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f132cd80-c760-445f-b6bf-41d35700b35c" path="/var/lib/kubelet/pods/f132cd80-c760-445f-b6bf-41d35700b35c/volumes" Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.339397 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9aaed5-97ec-40d8-97dc-9783fd1c682f" path="/var/lib/kubelet/pods/fb9aaed5-97ec-40d8-97dc-9783fd1c682f/volumes" Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.402695 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.410060 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c74720-9ea2-42cd-93d6-1c17ede15e62-etc-swift\") pod \"swift-storage-0\" (UID: \"f3c74720-9ea2-42cd-93d6-1c17ede15e62\") " pod="openstack/swift-storage-0" Jan 25 05:53:31 crc kubenswrapper[4728]: I0125 05:53:31.663513 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.159007 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 25 05:53:32 crc kubenswrapper[4728]: W0125 05:53:32.169468 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3c74720_9ea2_42cd_93d6_1c17ede15e62.slice/crio-992e72dd30d0787151fbcfab9d679845c423af21ff4ca9e7f583bd8848a8f939 WatchSource:0}: Error finding container 992e72dd30d0787151fbcfab9d679845c423af21ff4ca9e7f583bd8848a8f939: Status 404 returned error can't find the container with id 992e72dd30d0787151fbcfab9d679845c423af21ff4ca9e7f583bd8848a8f939 Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.177623 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"992e72dd30d0787151fbcfab9d679845c423af21ff4ca9e7f583bd8848a8f939"} Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.179294 4728 generic.go:334] "Generic (PLEG): container finished" podID="747ed3cf-861f-46d7-8411-3c3318fbff34" containerID="4fc3b8c2499c9361965e470f0a3e35ca515d24994ae4c5844cbaa0e40f3fa638" exitCode=0 Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.179404 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-575nj" event={"ID":"747ed3cf-861f-46d7-8411-3c3318fbff34","Type":"ContainerDied","Data":"4fc3b8c2499c9361965e470f0a3e35ca515d24994ae4c5844cbaa0e40f3fa638"} Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.472737 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.521983 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-operator-scripts\") pod \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.522127 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbvfp\" (UniqueName: \"kubernetes.io/projected/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-kube-api-access-zbvfp\") pod \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\" (UID: \"4eaf8418-c302-42cc-bd8e-a0797df4a1a1\") " Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.523421 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4eaf8418-c302-42cc-bd8e-a0797df4a1a1" (UID: "4eaf8418-c302-42cc-bd8e-a0797df4a1a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.529225 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-kube-api-access-zbvfp" (OuterVolumeSpecName: "kube-api-access-zbvfp") pod "4eaf8418-c302-42cc-bd8e-a0797df4a1a1" (UID: "4eaf8418-c302-42cc-bd8e-a0797df4a1a1"). InnerVolumeSpecName "kube-api-access-zbvfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.623577 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:32 crc kubenswrapper[4728]: I0125 05:53:32.623613 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbvfp\" (UniqueName: \"kubernetes.io/projected/4eaf8418-c302-42cc-bd8e-a0797df4a1a1-kube-api-access-zbvfp\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.189578 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7r9b2" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.189553 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7r9b2" event={"ID":"4eaf8418-c302-42cc-bd8e-a0797df4a1a1","Type":"ContainerDied","Data":"9c24faabdb9dea4e64b3467cecd63a4e5dd1a21d9278ffe22530e715881c85c1"} Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.189955 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c24faabdb9dea4e64b3467cecd63a4e5dd1a21d9278ffe22530e715881c85c1" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.446732 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w8ld9"] Jan 25 05:53:33 crc kubenswrapper[4728]: E0125 05:53:33.447040 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eaf8418-c302-42cc-bd8e-a0797df4a1a1" containerName="mariadb-account-create-update" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.447060 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eaf8418-c302-42cc-bd8e-a0797df4a1a1" containerName="mariadb-account-create-update" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.447426 4728 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4eaf8418-c302-42cc-bd8e-a0797df4a1a1" containerName="mariadb-account-create-update" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.447856 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.452918 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.453689 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jscv9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.460027 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w8ld9"] Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.546131 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-db-sync-config-data\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.546191 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-config-data\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.546315 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wzb\" (UniqueName: \"kubernetes.io/projected/db13ac9b-f01e-42d8-b455-db929ef4b64c-kube-api-access-m7wzb\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " 
pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.546386 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-combined-ca-bundle\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.584273 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.647850 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/747ed3cf-861f-46d7-8411-3c3318fbff34-etc-swift\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.647907 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-combined-ca-bundle\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.647953 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-scripts\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.648086 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-ring-data-devices\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: 
\"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.648118 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-dispersionconf\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.648191 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxzlf\" (UniqueName: \"kubernetes.io/projected/747ed3cf-861f-46d7-8411-3c3318fbff34-kube-api-access-zxzlf\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.648313 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-swiftconf\") pod \"747ed3cf-861f-46d7-8411-3c3318fbff34\" (UID: \"747ed3cf-861f-46d7-8411-3c3318fbff34\") " Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.648847 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-config-data\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.649451 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wzb\" (UniqueName: \"kubernetes.io/projected/db13ac9b-f01e-42d8-b455-db929ef4b64c-kube-api-access-m7wzb\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.649486 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-combined-ca-bundle\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.649663 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-db-sync-config-data\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.648832 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/747ed3cf-861f-46d7-8411-3c3318fbff34-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.649387 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.655135 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-config-data\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.658501 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.658864 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-combined-ca-bundle\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.662720 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-db-sync-config-data\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.663205 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747ed3cf-861f-46d7-8411-3c3318fbff34-kube-api-access-zxzlf" (OuterVolumeSpecName: "kube-api-access-zxzlf") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "kube-api-access-zxzlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.665210 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wzb\" (UniqueName: \"kubernetes.io/projected/db13ac9b-f01e-42d8-b455-db929ef4b64c-kube-api-access-m7wzb\") pod \"glance-db-sync-w8ld9\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.671886 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-scripts" (OuterVolumeSpecName: "scripts") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.672221 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.679558 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "747ed3cf-861f-46d7-8411-3c3318fbff34" (UID: "747ed3cf-861f-46d7-8411-3c3318fbff34"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.751958 4728 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.751983 4728 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.751996 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxzlf\" (UniqueName: \"kubernetes.io/projected/747ed3cf-861f-46d7-8411-3c3318fbff34-kube-api-access-zxzlf\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.752007 4728 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.752014 4728 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/747ed3cf-861f-46d7-8411-3c3318fbff34-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.752023 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ed3cf-861f-46d7-8411-3c3318fbff34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.752031 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/747ed3cf-861f-46d7-8411-3c3318fbff34-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:33 crc kubenswrapper[4728]: I0125 05:53:33.770185 4728 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:34 crc kubenswrapper[4728]: I0125 05:53:34.199411 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-575nj" event={"ID":"747ed3cf-861f-46d7-8411-3c3318fbff34","Type":"ContainerDied","Data":"ac5f13e5bc7d021ef3fbb85ccbc765991c58b1467fd41cf4f265d0ec6ac56761"} Jan 25 05:53:34 crc kubenswrapper[4728]: I0125 05:53:34.199509 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac5f13e5bc7d021ef3fbb85ccbc765991c58b1467fd41cf4f265d0ec6ac56761" Jan 25 05:53:34 crc kubenswrapper[4728]: I0125 05:53:34.199459 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-575nj" Jan 25 05:53:34 crc kubenswrapper[4728]: I0125 05:53:34.247152 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w8ld9"] Jan 25 05:53:35 crc kubenswrapper[4728]: I0125 05:53:35.208640 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8ld9" event={"ID":"db13ac9b-f01e-42d8-b455-db929ef4b64c","Type":"ContainerStarted","Data":"4830b16106f88b0a9deda0d5573143bcc9c7ecf5fea9b046f0354f9959dff596"} Jan 25 05:53:35 crc kubenswrapper[4728]: I0125 05:53:35.741996 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 25 05:53:36 crc kubenswrapper[4728]: I0125 05:53:36.226368 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"cac217ab157a7e3fbfae4772a12bbf53e3c6f3bc7e488a279dd9a200cafd36d2"} Jan 25 05:53:36 crc kubenswrapper[4728]: I0125 05:53:36.226438 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"b09f984f214c698c8f42f2c70f8fcc9307ee9d7605e20ec6849e46b02f894253"} Jan 25 05:53:36 crc kubenswrapper[4728]: I0125 05:53:36.226450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"3e0b3f8308c9b9272048b3f8605deb9ff49556a0b067fbe2f3f86c180590f641"} Jan 25 05:53:36 crc kubenswrapper[4728]: I0125 05:53:36.226471 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"8a0ea41ed067344f523d88e30b6e4de2e8e7df5dc45bfb20de6378cf493884aa"} Jan 25 05:53:37 crc kubenswrapper[4728]: I0125 05:53:37.240477 4728 generic.go:334] "Generic (PLEG): container finished" podID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerID="c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e" exitCode=0 Jan 25 05:53:37 crc kubenswrapper[4728]: I0125 05:53:37.240576 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ddd3d99e-20c0-4133-9537-413f83a04edb","Type":"ContainerDied","Data":"c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e"} Jan 25 05:53:37 crc kubenswrapper[4728]: I0125 05:53:37.248016 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"8f9223465a0db5bbaecfada06b8ffdfe7af563c4665d0e565e92033e965e1b04"} Jan 25 05:53:37 crc kubenswrapper[4728]: I0125 05:53:37.250102 4728 generic.go:334] "Generic (PLEG): container finished" podID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerID="245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e" exitCode=0 Jan 25 05:53:37 crc kubenswrapper[4728]: I0125 05:53:37.250128 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2","Type":"ContainerDied","Data":"245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e"} Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.269812 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2","Type":"ContainerStarted","Data":"14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1"} Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.270339 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.275073 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ddd3d99e-20c0-4133-9537-413f83a04edb","Type":"ContainerStarted","Data":"1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77"} Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.275454 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.281248 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"8540b3642420f2d0f5088d10741bdd456035e367323297663509d5fe1182fd33"} Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.281339 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"9dbc5279f3b8773afd9cb427782dffcafc945cee01b38b747e3a4a758c9d38a7"} Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.281355 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"45793668debd284ecca2e74fad37530546bc571b29e874c171cc8a4f70920e9d"} Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.293962 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.21248765 podStartE2EDuration="1m0.29394992s" podCreationTimestamp="2026-01-25 05:52:38 +0000 UTC" firstStartedPulling="2026-01-25 05:52:40.193924402 +0000 UTC m=+851.229802382" lastFinishedPulling="2026-01-25 05:53:04.275386673 +0000 UTC m=+875.311264652" observedRunningTime="2026-01-25 05:53:38.286274506 +0000 UTC m=+909.322152487" watchObservedRunningTime="2026-01-25 05:53:38.29394992 +0000 UTC m=+909.329827900" Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.307157 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.923397907 podStartE2EDuration="1m0.30714964s" podCreationTimestamp="2026-01-25 05:52:38 +0000 UTC" firstStartedPulling="2026-01-25 05:52:40.163495886 +0000 UTC m=+851.199373866" lastFinishedPulling="2026-01-25 05:53:04.54724762 +0000 UTC m=+875.583125599" observedRunningTime="2026-01-25 05:53:38.303479979 +0000 UTC m=+909.339357960" watchObservedRunningTime="2026-01-25 05:53:38.30714964 +0000 UTC m=+909.343027620" Jan 25 05:53:38 crc kubenswrapper[4728]: I0125 05:53:38.449980 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6x4kp" podUID="193b75ba-c337-4422-88ce-aace97ac7638" containerName="ovn-controller" probeResult="failure" output=< Jan 25 05:53:38 crc kubenswrapper[4728]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 25 05:53:38 crc kubenswrapper[4728]: > Jan 25 05:53:39 crc kubenswrapper[4728]: E0125 05:53:39.443447 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice/crio-0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice/crio-30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8\": RecentStats: unable to find data in memory cache]" Jan 25 05:53:40 crc kubenswrapper[4728]: I0125 05:53:40.304867 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"b53dd19a18052a1776b9c249b7d80b484c0facae69c548ac42341abbdd701c5f"} Jan 25 05:53:40 crc kubenswrapper[4728]: I0125 05:53:40.305104 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"d2f3e686b5e747ea64b9ee32f8aaf33f74be2b6681f47637c4c822feff91c6da"} Jan 25 05:53:40 crc kubenswrapper[4728]: I0125 05:53:40.305115 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"900315273e71db69eb394d318c06ca9c7106e55a9087d9d62386f0db2f853feb"} Jan 25 05:53:40 crc kubenswrapper[4728]: I0125 05:53:40.305123 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"a83413427d56a9fe4a5c7a547e84ed37900532f1881bf6c7e80587a807998082"} 
Jan 25 05:53:40 crc kubenswrapper[4728]: I0125 05:53:40.305131 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"8eb84361982650abfd9ff951811d158e0641f86e982695a1077f84f8f7b76e04"} Jan 25 05:53:40 crc kubenswrapper[4728]: I0125 05:53:40.305138 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"b425109cfc857145ff2451aad30afb49e3a0e4bf861c5663a8667cbbe76161b8"} Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.319679 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f3c74720-9ea2-42cd-93d6-1c17ede15e62","Type":"ContainerStarted","Data":"52dfcdc3337c19ffbcdd775ac27f3173900a32bdb66d66266f8b111088da6906"} Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.359158 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.145448509 podStartE2EDuration="27.3591417s" podCreationTimestamp="2026-01-25 05:53:14 +0000 UTC" firstStartedPulling="2026-01-25 05:53:32.171837336 +0000 UTC m=+903.207715316" lastFinishedPulling="2026-01-25 05:53:39.385530526 +0000 UTC m=+910.421408507" observedRunningTime="2026-01-25 05:53:41.349207528 +0000 UTC m=+912.385085507" watchObservedRunningTime="2026-01-25 05:53:41.3591417 +0000 UTC m=+912.395019680" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.590355 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f45b9f4bc-g92ls"] Jan 25 05:53:41 crc kubenswrapper[4728]: E0125 05:53:41.590685 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747ed3cf-861f-46d7-8411-3c3318fbff34" containerName="swift-ring-rebalance" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.590703 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="747ed3cf-861f-46d7-8411-3c3318fbff34" containerName="swift-ring-rebalance" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.590892 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="747ed3cf-861f-46d7-8411-3c3318fbff34" containerName="swift-ring-rebalance" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.591659 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.593906 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.607587 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f45b9f4bc-g92ls"] Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.792110 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-config\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.792178 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgbmh\" (UniqueName: \"kubernetes.io/projected/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-kube-api-access-hgbmh\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.792285 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-sb\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " 
pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.792355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-swift-storage-0\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.792522 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-nb\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.792548 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-svc\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.893608 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-nb\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.893917 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-svc\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " 
pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.894082 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-config\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.894206 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgbmh\" (UniqueName: \"kubernetes.io/projected/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-kube-api-access-hgbmh\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.894314 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-sb\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.894455 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-swift-storage-0\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.894615 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-nb\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc 
kubenswrapper[4728]: I0125 05:53:41.894626 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-svc\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.895078 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-sb\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.895277 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-swift-storage-0\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.895347 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-config\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.915810 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgbmh\" (UniqueName: \"kubernetes.io/projected/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-kube-api-access-hgbmh\") pod \"dnsmasq-dns-f45b9f4bc-g92ls\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:41 crc kubenswrapper[4728]: I0125 05:53:41.919108 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:42 crc kubenswrapper[4728]: I0125 05:53:42.369165 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f45b9f4bc-g92ls"] Jan 25 05:53:42 crc kubenswrapper[4728]: W0125 05:53:42.376096 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8aebef0_3647_4ca2_a703_4d0c2033c7fd.slice/crio-a88c3507dbfac6a89410f30fc25bcd17f0e6762d29e4ac3a388b7315cec1887b WatchSource:0}: Error finding container a88c3507dbfac6a89410f30fc25bcd17f0e6762d29e4ac3a388b7315cec1887b: Status 404 returned error can't find the container with id a88c3507dbfac6a89410f30fc25bcd17f0e6762d29e4ac3a388b7315cec1887b Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.334488 4728 generic.go:334] "Generic (PLEG): container finished" podID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerID="e5ad7ccaa64be3ac12d1755bce860c7140dbc38b021130af937cc88b1d052130" exitCode=0 Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.337166 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" event={"ID":"a8aebef0-3647-4ca2-a703-4d0c2033c7fd","Type":"ContainerDied","Data":"e5ad7ccaa64be3ac12d1755bce860c7140dbc38b021130af937cc88b1d052130"} Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.337203 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" event={"ID":"a8aebef0-3647-4ca2-a703-4d0c2033c7fd","Type":"ContainerStarted","Data":"a88c3507dbfac6a89410f30fc25bcd17f0e6762d29e4ac3a388b7315cec1887b"} Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.441062 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6x4kp" podUID="193b75ba-c337-4422-88ce-aace97ac7638" containerName="ovn-controller" probeResult="failure" output=< Jan 25 05:53:43 crc kubenswrapper[4728]: ERROR - ovn-controller connection 
status is 'not connected', expecting 'connected' status Jan 25 05:53:43 crc kubenswrapper[4728]: > Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.462407 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.463639 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mr9hh" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.777592 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6x4kp-config-6jdqt"] Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.778690 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.783066 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.788357 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6x4kp-config-6jdqt"] Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.930113 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run-ovn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.930164 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-additional-scripts\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 
05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.930247 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-scripts\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.930280 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.930311 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtpn\" (UniqueName: \"kubernetes.io/projected/472f53a9-932e-4244-9136-9249f9f0e3ce-kube-api-access-cmtpn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:43 crc kubenswrapper[4728]: I0125 05:53:43.930382 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-log-ovn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.032552 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-additional-scripts\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") 
" pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.032652 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-scripts\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.032684 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.032725 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtpn\" (UniqueName: \"kubernetes.io/projected/472f53a9-932e-4244-9136-9249f9f0e3ce-kube-api-access-cmtpn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.032751 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-log-ovn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.032801 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run-ovn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " 
pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.033117 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run-ovn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.033436 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-additional-scripts\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.033804 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.033855 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-log-ovn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.034866 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-scripts\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc 
kubenswrapper[4728]: I0125 05:53:44.054555 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtpn\" (UniqueName: \"kubernetes.io/projected/472f53a9-932e-4244-9136-9249f9f0e3ce-kube-api-access-cmtpn\") pod \"ovn-controller-6x4kp-config-6jdqt\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.093862 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.345437 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" event={"ID":"a8aebef0-3647-4ca2-a703-4d0c2033c7fd","Type":"ContainerStarted","Data":"a2ad86531326a1df76df40666b7710cca44a224060a7715e3e62fabb803c797b"} Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.345783 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.364013 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" podStartSLOduration=3.36400349 podStartE2EDuration="3.36400349s" podCreationTimestamp="2026-01-25 05:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:53:44.357701277 +0000 UTC m=+915.393579257" watchObservedRunningTime="2026-01-25 05:53:44.36400349 +0000 UTC m=+915.399881469" Jan 25 05:53:44 crc kubenswrapper[4728]: I0125 05:53:44.517864 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6x4kp-config-6jdqt"] Jan 25 05:53:48 crc kubenswrapper[4728]: I0125 05:53:48.438854 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6x4kp" podUID="193b75ba-c337-4422-88ce-aace97ac7638" 
containerName="ovn-controller" probeResult="failure" output=< Jan 25 05:53:48 crc kubenswrapper[4728]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 25 05:53:48 crc kubenswrapper[4728]: > Jan 25 05:53:49 crc kubenswrapper[4728]: W0125 05:53:49.555133 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472f53a9_932e_4244_9136_9249f9f0e3ce.slice/crio-d1cf016ab19b462e007b578ba7cacb4df13e80fe321168f95ef76734452bad1e WatchSource:0}: Error finding container d1cf016ab19b462e007b578ba7cacb4df13e80fe321168f95ef76734452bad1e: Status 404 returned error can't find the container with id d1cf016ab19b462e007b578ba7cacb4df13e80fe321168f95ef76734452bad1e Jan 25 05:53:49 crc kubenswrapper[4728]: E0125 05:53:49.610647 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice/crio-30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice/crio-0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice\": RecentStats: unable to find data in memory cache]" Jan 25 05:53:49 crc kubenswrapper[4728]: I0125 05:53:49.708509 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 25 05:53:49 crc kubenswrapper[4728]: I0125 05:53:49.988495 4728 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.001780 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-704e-account-create-update-9vw4c"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.002858 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.009144 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.052829 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-704e-account-create-update-9vw4c"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.076260 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kq85j"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.077265 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.087134 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kq85j"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.129607 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dllgm"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.131417 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.137350 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0621-account-create-update-6q629"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.138182 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.141359 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjtdp\" (UniqueName: \"kubernetes.io/projected/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-kube-api-access-bjtdp\") pod \"cinder-704e-account-create-update-9vw4c\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.141441 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-operator-scripts\") pod \"cinder-704e-account-create-update-9vw4c\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.142483 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dllgm"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.143576 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.148356 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0621-account-create-update-6q629"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.244255 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-operator-scripts\") pod \"cinder-704e-account-create-update-9vw4c\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.244667 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a53ae7b-d679-4ae7-a6c6-d3465781c613-operator-scripts\") pod \"barbican-0621-account-create-update-6q629\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.244705 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m88w\" (UniqueName: \"kubernetes.io/projected/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-kube-api-access-2m88w\") pod \"barbican-db-create-dllgm\" (UID: \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.244751 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mct4m\" (UniqueName: \"kubernetes.io/projected/0995cb23-3429-4757-95d1-7f48216b7dce-kube-api-access-mct4m\") pod \"cinder-db-create-kq85j\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.244967 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f96c\" (UniqueName: \"kubernetes.io/projected/6a53ae7b-d679-4ae7-a6c6-d3465781c613-kube-api-access-2f96c\") pod \"barbican-0621-account-create-update-6q629\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.245038 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-operator-scripts\") pod \"barbican-db-create-dllgm\" (UID: \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " 
pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.245178 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjtdp\" (UniqueName: \"kubernetes.io/projected/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-kube-api-access-bjtdp\") pod \"cinder-704e-account-create-update-9vw4c\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.245309 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0995cb23-3429-4757-95d1-7f48216b7dce-operator-scripts\") pod \"cinder-db-create-kq85j\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.245313 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-operator-scripts\") pod \"cinder-704e-account-create-update-9vw4c\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.262140 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-r67j5"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.263222 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.265239 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.265556 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.265737 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lk66m" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.265924 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.267925 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjtdp\" (UniqueName: \"kubernetes.io/projected/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-kube-api-access-bjtdp\") pod \"cinder-704e-account-create-update-9vw4c\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.275562 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-r67j5"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.318055 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qh5qp"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.319230 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.326004 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qh5qp"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.327585 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.347772 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0995cb23-3429-4757-95d1-7f48216b7dce-operator-scripts\") pod \"cinder-db-create-kq85j\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.347986 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmtf\" (UniqueName: \"kubernetes.io/projected/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-kube-api-access-4vmtf\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348086 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a53ae7b-d679-4ae7-a6c6-d3465781c613-operator-scripts\") pod \"barbican-0621-account-create-update-6q629\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348155 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m88w\" (UniqueName: \"kubernetes.io/projected/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-kube-api-access-2m88w\") pod \"barbican-db-create-dllgm\" (UID: \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348222 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-config-data\") pod \"keystone-db-sync-r67j5\" (UID: 
\"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348305 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mct4m\" (UniqueName: \"kubernetes.io/projected/0995cb23-3429-4757-95d1-7f48216b7dce-kube-api-access-mct4m\") pod \"cinder-db-create-kq85j\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348418 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-combined-ca-bundle\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348516 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f96c\" (UniqueName: \"kubernetes.io/projected/6a53ae7b-d679-4ae7-a6c6-d3465781c613-kube-api-access-2f96c\") pod \"barbican-0621-account-create-update-6q629\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348604 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-operator-scripts\") pod \"barbican-db-create-dllgm\" (UID: \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.349303 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-operator-scripts\") pod \"barbican-db-create-dllgm\" (UID: 
\"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.349303 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a53ae7b-d679-4ae7-a6c6-d3465781c613-operator-scripts\") pod \"barbican-0621-account-create-update-6q629\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.348528 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0995cb23-3429-4757-95d1-7f48216b7dce-operator-scripts\") pod \"cinder-db-create-kq85j\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.365304 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mct4m\" (UniqueName: \"kubernetes.io/projected/0995cb23-3429-4757-95d1-7f48216b7dce-kube-api-access-mct4m\") pod \"cinder-db-create-kq85j\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.365628 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f96c\" (UniqueName: \"kubernetes.io/projected/6a53ae7b-d679-4ae7-a6c6-d3465781c613-kube-api-access-2f96c\") pod \"barbican-0621-account-create-update-6q629\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.365891 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m88w\" (UniqueName: \"kubernetes.io/projected/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-kube-api-access-2m88w\") pod \"barbican-db-create-dllgm\" (UID: 
\"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.395985 4728 generic.go:334] "Generic (PLEG): container finished" podID="472f53a9-932e-4244-9136-9249f9f0e3ce" containerID="0b02f2609214d5e0a973fa589ad8eed9a37401113ec309b20c498e1baab1ead5" exitCode=0 Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.396779 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6x4kp-config-6jdqt" event={"ID":"472f53a9-932e-4244-9136-9249f9f0e3ce","Type":"ContainerDied","Data":"0b02f2609214d5e0a973fa589ad8eed9a37401113ec309b20c498e1baab1ead5"} Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.396842 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6x4kp-config-6jdqt" event={"ID":"472f53a9-932e-4244-9136-9249f9f0e3ce","Type":"ContainerStarted","Data":"d1cf016ab19b462e007b578ba7cacb4df13e80fe321168f95ef76734452bad1e"} Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.397385 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.399147 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8ld9" event={"ID":"db13ac9b-f01e-42d8-b455-db929ef4b64c","Type":"ContainerStarted","Data":"153c3e929ee527e8d0eefcd225e5a333815034615f90b9d17a201bdf5dc4b832"} Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.434084 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b6e8-account-create-update-pg7cx"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.435519 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.438519 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.438976 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w8ld9" podStartSLOduration=2.089203453 podStartE2EDuration="17.438953985s" podCreationTimestamp="2026-01-25 05:53:33 +0000 UTC" firstStartedPulling="2026-01-25 05:53:34.254905443 +0000 UTC m=+905.290783422" lastFinishedPulling="2026-01-25 05:53:49.604655974 +0000 UTC m=+920.640533954" observedRunningTime="2026-01-25 05:53:50.434384287 +0000 UTC m=+921.470262257" watchObservedRunningTime="2026-01-25 05:53:50.438953985 +0000 UTC m=+921.474831964" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.448779 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.450086 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-combined-ca-bundle\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.450168 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7319f4f1-86d5-4681-ba6c-012c0f3039ac-operator-scripts\") pod \"neutron-db-create-qh5qp\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.450206 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6ml78\" (UniqueName: \"kubernetes.io/projected/7319f4f1-86d5-4681-ba6c-012c0f3039ac-kube-api-access-6ml78\") pod \"neutron-db-create-qh5qp\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.450237 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmtf\" (UniqueName: \"kubernetes.io/projected/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-kube-api-access-4vmtf\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.450272 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-config-data\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.454232 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-combined-ca-bundle\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.460097 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b6e8-account-create-update-pg7cx"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.460939 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-config-data\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.469668 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmtf\" (UniqueName: \"kubernetes.io/projected/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-kube-api-access-4vmtf\") pod \"keystone-db-sync-r67j5\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.473588 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.552160 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ml78\" (UniqueName: \"kubernetes.io/projected/7319f4f1-86d5-4681-ba6c-012c0f3039ac-kube-api-access-6ml78\") pod \"neutron-db-create-qh5qp\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.552309 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f56fg\" (UniqueName: \"kubernetes.io/projected/803f99c7-af4a-4c8a-99ac-42a58563c3d2-kube-api-access-f56fg\") pod \"neutron-b6e8-account-create-update-pg7cx\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.552372 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803f99c7-af4a-4c8a-99ac-42a58563c3d2-operator-scripts\") pod \"neutron-b6e8-account-create-update-pg7cx\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.552476 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7319f4f1-86d5-4681-ba6c-012c0f3039ac-operator-scripts\") pod \"neutron-db-create-qh5qp\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.553288 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7319f4f1-86d5-4681-ba6c-012c0f3039ac-operator-scripts\") pod \"neutron-db-create-qh5qp\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.573287 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ml78\" (UniqueName: \"kubernetes.io/projected/7319f4f1-86d5-4681-ba6c-012c0f3039ac-kube-api-access-6ml78\") pod \"neutron-db-create-qh5qp\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.577239 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-r67j5" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.630862 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.654126 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803f99c7-af4a-4c8a-99ac-42a58563c3d2-operator-scripts\") pod \"neutron-b6e8-account-create-update-pg7cx\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.654285 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f56fg\" (UniqueName: \"kubernetes.io/projected/803f99c7-af4a-4c8a-99ac-42a58563c3d2-kube-api-access-f56fg\") pod \"neutron-b6e8-account-create-update-pg7cx\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.655277 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803f99c7-af4a-4c8a-99ac-42a58563c3d2-operator-scripts\") pod \"neutron-b6e8-account-create-update-pg7cx\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.674043 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f56fg\" (UniqueName: \"kubernetes.io/projected/803f99c7-af4a-4c8a-99ac-42a58563c3d2-kube-api-access-f56fg\") pod \"neutron-b6e8-account-create-update-pg7cx\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.754775 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.777952 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-704e-account-create-update-9vw4c"] Jan 25 05:53:50 crc kubenswrapper[4728]: W0125 05:53:50.783269 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe31ea3f_183c_4282_bf47_2fc7e17ab5b0.slice/crio-ffe651d9e38bfeff50e0aa71bb68663b635b52d0786ec4f2dea73202fa89c923 WatchSource:0}: Error finding container ffe651d9e38bfeff50e0aa71bb68663b635b52d0786ec4f2dea73202fa89c923: Status 404 returned error can't find the container with id ffe651d9e38bfeff50e0aa71bb68663b635b52d0786ec4f2dea73202fa89c923 Jan 25 05:53:50 crc kubenswrapper[4728]: W0125 05:53:50.933198 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0995cb23_3429_4757_95d1_7f48216b7dce.slice/crio-1ed8cbcb86beb7f012dc4b0154f895bf33f1bb59226597110569fd0925ef3a5b WatchSource:0}: Error finding container 1ed8cbcb86beb7f012dc4b0154f895bf33f1bb59226597110569fd0925ef3a5b: Status 404 returned error can't find the container with id 1ed8cbcb86beb7f012dc4b0154f895bf33f1bb59226597110569fd0925ef3a5b Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.950793 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dllgm"] Jan 25 05:53:50 crc kubenswrapper[4728]: I0125 05:53:50.964573 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kq85j"] Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.108778 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0621-account-create-update-6q629"] Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.121772 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-r67j5"] Jan 25 
05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.127270 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qh5qp"] Jan 25 05:53:51 crc kubenswrapper[4728]: W0125 05:53:51.132023 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7319f4f1_86d5_4681_ba6c_012c0f3039ac.slice/crio-6400570f48b7e750e6b971b89dd57594c584c957963f6d709579df06e9c14d7e WatchSource:0}: Error finding container 6400570f48b7e750e6b971b89dd57594c584c957963f6d709579df06e9c14d7e: Status 404 returned error can't find the container with id 6400570f48b7e750e6b971b89dd57594c584c957963f6d709579df06e9c14d7e Jan 25 05:53:51 crc kubenswrapper[4728]: W0125 05:53:51.140048 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00b9902f_5c07_4a73_8a08_0a1c28e09fd8.slice/crio-403fad4e938a44317f5a12d94b831521f764c9ca4da8b6664c47c2e280daf665 WatchSource:0}: Error finding container 403fad4e938a44317f5a12d94b831521f764c9ca4da8b6664c47c2e280daf665: Status 404 returned error can't find the container with id 403fad4e938a44317f5a12d94b831521f764c9ca4da8b6664c47c2e280daf665 Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.310613 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b6e8-account-create-update-pg7cx"] Jan 25 05:53:51 crc kubenswrapper[4728]: W0125 05:53:51.317827 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod803f99c7_af4a_4c8a_99ac_42a58563c3d2.slice/crio-2b67b337d202cde03a0e6b8ada079733cfb181a6e57fa3db2db2af24fb546b64 WatchSource:0}: Error finding container 2b67b337d202cde03a0e6b8ada079733cfb181a6e57fa3db2db2af24fb546b64: Status 404 returned error can't find the container with id 2b67b337d202cde03a0e6b8ada079733cfb181a6e57fa3db2db2af24fb546b64 Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 
05:53:51.408911 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qh5qp" event={"ID":"7319f4f1-86d5-4681-ba6c-012c0f3039ac","Type":"ContainerStarted","Data":"9254c641e6296bc21d55e1dfdc27a045033efaec56fddf85a1c1a79c8777b95a"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.408955 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qh5qp" event={"ID":"7319f4f1-86d5-4681-ba6c-012c0f3039ac","Type":"ContainerStarted","Data":"6400570f48b7e750e6b971b89dd57594c584c957963f6d709579df06e9c14d7e"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.413564 4728 generic.go:334] "Generic (PLEG): container finished" podID="d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" containerID="7290a07d6a05d00f86f00dcd4ab47bd33b677d8ce2c9aca1e22ac4f532e78615" exitCode=0 Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.413678 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dllgm" event={"ID":"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1","Type":"ContainerDied","Data":"7290a07d6a05d00f86f00dcd4ab47bd33b677d8ce2c9aca1e22ac4f532e78615"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.413718 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dllgm" event={"ID":"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1","Type":"ContainerStarted","Data":"18b1dd7560e09c419769d8f41425178a5e03c2b5d171b4debd64a273a34d6336"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.416016 4728 generic.go:334] "Generic (PLEG): container finished" podID="be31ea3f-183c-4282-bf47-2fc7e17ab5b0" containerID="ab496c3faf6e958c6f7ddb49fd6d5b1978f109534141e8344a9401aa4eda745f" exitCode=0 Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.416080 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-704e-account-create-update-9vw4c" 
event={"ID":"be31ea3f-183c-4282-bf47-2fc7e17ab5b0","Type":"ContainerDied","Data":"ab496c3faf6e958c6f7ddb49fd6d5b1978f109534141e8344a9401aa4eda745f"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.416105 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-704e-account-create-update-9vw4c" event={"ID":"be31ea3f-183c-4282-bf47-2fc7e17ab5b0","Type":"ContainerStarted","Data":"ffe651d9e38bfeff50e0aa71bb68663b635b52d0786ec4f2dea73202fa89c923"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.417235 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0621-account-create-update-6q629" event={"ID":"6a53ae7b-d679-4ae7-a6c6-d3465781c613","Type":"ContainerStarted","Data":"35863c313e57052c1427c3c7d10d10a7664fc2dbaa5d801ca8fc1e6958f77af2"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.417276 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0621-account-create-update-6q629" event={"ID":"6a53ae7b-d679-4ae7-a6c6-d3465781c613","Type":"ContainerStarted","Data":"d3a8d7659c8f4fb982a4c3c83d60717b91ebcd8f267875049c7be8b3f4ef3b62"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.418135 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r67j5" event={"ID":"00b9902f-5c07-4a73-8a08-0a1c28e09fd8","Type":"ContainerStarted","Data":"403fad4e938a44317f5a12d94b831521f764c9ca4da8b6664c47c2e280daf665"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.420468 4728 generic.go:334] "Generic (PLEG): container finished" podID="0995cb23-3429-4757-95d1-7f48216b7dce" containerID="34d7a4a61b459b8b560ca6a4a343deed5bd1033a3bb04f2f77b293561bdcdfe3" exitCode=0 Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.420525 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kq85j" 
event={"ID":"0995cb23-3429-4757-95d1-7f48216b7dce","Type":"ContainerDied","Data":"34d7a4a61b459b8b560ca6a4a343deed5bd1033a3bb04f2f77b293561bdcdfe3"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.420659 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kq85j" event={"ID":"0995cb23-3429-4757-95d1-7f48216b7dce","Type":"ContainerStarted","Data":"1ed8cbcb86beb7f012dc4b0154f895bf33f1bb59226597110569fd0925ef3a5b"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.422270 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6e8-account-create-update-pg7cx" event={"ID":"803f99c7-af4a-4c8a-99ac-42a58563c3d2","Type":"ContainerStarted","Data":"2b67b337d202cde03a0e6b8ada079733cfb181a6e57fa3db2db2af24fb546b64"} Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.432488 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-qh5qp" podStartSLOduration=1.432467274 podStartE2EDuration="1.432467274s" podCreationTimestamp="2026-01-25 05:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:53:51.426105899 +0000 UTC m=+922.461983879" watchObservedRunningTime="2026-01-25 05:53:51.432467274 +0000 UTC m=+922.468345254" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.463020 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0621-account-create-update-6q629" podStartSLOduration=1.463002942 podStartE2EDuration="1.463002942s" podCreationTimestamp="2026-01-25 05:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:53:51.457260915 +0000 UTC m=+922.493138895" watchObservedRunningTime="2026-01-25 05:53:51.463002942 +0000 UTC m=+922.498880922" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.789548 4728 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.890544 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run\") pod \"472f53a9-932e-4244-9136-9249f9f0e3ce\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.890752 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-scripts\") pod \"472f53a9-932e-4244-9136-9249f9f0e3ce\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.890800 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-log-ovn\") pod \"472f53a9-932e-4244-9136-9249f9f0e3ce\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.890839 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-additional-scripts\") pod \"472f53a9-932e-4244-9136-9249f9f0e3ce\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.890898 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtpn\" (UniqueName: \"kubernetes.io/projected/472f53a9-932e-4244-9136-9249f9f0e3ce-kube-api-access-cmtpn\") pod \"472f53a9-932e-4244-9136-9249f9f0e3ce\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.890946 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run-ovn\") pod \"472f53a9-932e-4244-9136-9249f9f0e3ce\" (UID: \"472f53a9-932e-4244-9136-9249f9f0e3ce\") " Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.891195 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "472f53a9-932e-4244-9136-9249f9f0e3ce" (UID: "472f53a9-932e-4244-9136-9249f9f0e3ce"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.891337 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "472f53a9-932e-4244-9136-9249f9f0e3ce" (UID: "472f53a9-932e-4244-9136-9249f9f0e3ce"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.891823 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "472f53a9-932e-4244-9136-9249f9f0e3ce" (UID: "472f53a9-932e-4244-9136-9249f9f0e3ce"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.891210 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run" (OuterVolumeSpecName: "var-run") pod "472f53a9-932e-4244-9136-9249f9f0e3ce" (UID: "472f53a9-932e-4244-9136-9249f9f0e3ce"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.892034 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-scripts" (OuterVolumeSpecName: "scripts") pod "472f53a9-932e-4244-9136-9249f9f0e3ce" (UID: "472f53a9-932e-4244-9136-9249f9f0e3ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.892368 4728 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.892454 4728 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-run\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.892525 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.892586 4728 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/472f53a9-932e-4244-9136-9249f9f0e3ce-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.892636 4728 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/472f53a9-932e-4244-9136-9249f9f0e3ce-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.896998 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/472f53a9-932e-4244-9136-9249f9f0e3ce-kube-api-access-cmtpn" (OuterVolumeSpecName: "kube-api-access-cmtpn") pod "472f53a9-932e-4244-9136-9249f9f0e3ce" (UID: "472f53a9-932e-4244-9136-9249f9f0e3ce"). InnerVolumeSpecName "kube-api-access-cmtpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.920487 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.980896 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb45645f7-6b8qk"] Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.981131 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerName="dnsmasq-dns" containerID="cri-o://e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99" gracePeriod=10 Jan 25 05:53:51 crc kubenswrapper[4728]: I0125 05:53:51.997939 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtpn\" (UniqueName: \"kubernetes.io/projected/472f53a9-932e-4244-9136-9249f9f0e3ce-kube-api-access-cmtpn\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.303902 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.407007 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mnjn\" (UniqueName: \"kubernetes.io/projected/f62a2680-ed4b-449b-925c-e243731ea8b4-kube-api-access-9mnjn\") pod \"f62a2680-ed4b-449b-925c-e243731ea8b4\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.407070 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-nb\") pod \"f62a2680-ed4b-449b-925c-e243731ea8b4\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.407160 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-config\") pod \"f62a2680-ed4b-449b-925c-e243731ea8b4\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.407185 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-dns-svc\") pod \"f62a2680-ed4b-449b-925c-e243731ea8b4\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.407237 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-sb\") pod \"f62a2680-ed4b-449b-925c-e243731ea8b4\" (UID: \"f62a2680-ed4b-449b-925c-e243731ea8b4\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.412508 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f62a2680-ed4b-449b-925c-e243731ea8b4-kube-api-access-9mnjn" (OuterVolumeSpecName: "kube-api-access-9mnjn") pod "f62a2680-ed4b-449b-925c-e243731ea8b4" (UID: "f62a2680-ed4b-449b-925c-e243731ea8b4"). InnerVolumeSpecName "kube-api-access-9mnjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.433555 4728 generic.go:334] "Generic (PLEG): container finished" podID="6a53ae7b-d679-4ae7-a6c6-d3465781c613" containerID="35863c313e57052c1427c3c7d10d10a7664fc2dbaa5d801ca8fc1e6958f77af2" exitCode=0 Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.433621 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0621-account-create-update-6q629" event={"ID":"6a53ae7b-d679-4ae7-a6c6-d3465781c613","Type":"ContainerDied","Data":"35863c313e57052c1427c3c7d10d10a7664fc2dbaa5d801ca8fc1e6958f77af2"} Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.440484 4728 generic.go:334] "Generic (PLEG): container finished" podID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerID="e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99" exitCode=0 Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.440569 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.440900 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" event={"ID":"f62a2680-ed4b-449b-925c-e243731ea8b4","Type":"ContainerDied","Data":"e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99"} Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.440962 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb45645f7-6b8qk" event={"ID":"f62a2680-ed4b-449b-925c-e243731ea8b4","Type":"ContainerDied","Data":"ba4464f237638a76c94754e8a7dc8662ff2877b2f333e38e5be930f374d69c71"} Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.440982 4728 scope.go:117] "RemoveContainer" containerID="e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.442942 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f62a2680-ed4b-449b-925c-e243731ea8b4" (UID: "f62a2680-ed4b-449b-925c-e243731ea8b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.446645 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f62a2680-ed4b-449b-925c-e243731ea8b4" (UID: "f62a2680-ed4b-449b-925c-e243731ea8b4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.448049 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6x4kp-config-6jdqt" event={"ID":"472f53a9-932e-4244-9136-9249f9f0e3ce","Type":"ContainerDied","Data":"d1cf016ab19b462e007b578ba7cacb4df13e80fe321168f95ef76734452bad1e"} Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.448086 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1cf016ab19b462e007b578ba7cacb4df13e80fe321168f95ef76734452bad1e" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.448137 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6x4kp-config-6jdqt" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.449811 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-config" (OuterVolumeSpecName: "config") pod "f62a2680-ed4b-449b-925c-e243731ea8b4" (UID: "f62a2680-ed4b-449b-925c-e243731ea8b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.450151 4728 generic.go:334] "Generic (PLEG): container finished" podID="803f99c7-af4a-4c8a-99ac-42a58563c3d2" containerID="9b16d03f47f8d52a84050392808ba88d387bafab010b302c21dc736bc248968e" exitCode=0 Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.450207 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6e8-account-create-update-pg7cx" event={"ID":"803f99c7-af4a-4c8a-99ac-42a58563c3d2","Type":"ContainerDied","Data":"9b16d03f47f8d52a84050392808ba88d387bafab010b302c21dc736bc248968e"} Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.452894 4728 generic.go:334] "Generic (PLEG): container finished" podID="7319f4f1-86d5-4681-ba6c-012c0f3039ac" containerID="9254c641e6296bc21d55e1dfdc27a045033efaec56fddf85a1c1a79c8777b95a" exitCode=0 Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.453044 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qh5qp" event={"ID":"7319f4f1-86d5-4681-ba6c-012c0f3039ac","Type":"ContainerDied","Data":"9254c641e6296bc21d55e1dfdc27a045033efaec56fddf85a1c1a79c8777b95a"} Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.464839 4728 scope.go:117] "RemoveContainer" containerID="9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.473220 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f62a2680-ed4b-449b-925c-e243731ea8b4" (UID: "f62a2680-ed4b-449b-925c-e243731ea8b4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.507256 4728 scope.go:117] "RemoveContainer" containerID="e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99" Jan 25 05:53:52 crc kubenswrapper[4728]: E0125 05:53:52.507597 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99\": container with ID starting with e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99 not found: ID does not exist" containerID="e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.507632 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99"} err="failed to get container status \"e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99\": rpc error: code = NotFound desc = could not find container \"e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99\": container with ID starting with e29a1898d6d3e3002a3b9f0833cb8ca03099dff0ff2d570ffc73529ad8c23f99 not found: ID does not exist" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.507652 4728 scope.go:117] "RemoveContainer" containerID="9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595" Jan 25 05:53:52 crc kubenswrapper[4728]: E0125 05:53:52.507816 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595\": container with ID starting with 9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595 not found: ID does not exist" containerID="9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.507839 
4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595"} err="failed to get container status \"9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595\": rpc error: code = NotFound desc = could not find container \"9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595\": container with ID starting with 9e4aa238cfc7459ac07e45050a2aff5f976daa06a3119079e89f9646bf972595 not found: ID does not exist" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.508973 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.508999 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.509009 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.509018 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mnjn\" (UniqueName: \"kubernetes.io/projected/f62a2680-ed4b-449b-925c-e243731ea8b4-kube-api-access-9mnjn\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.509027 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62a2680-ed4b-449b-925c-e243731ea8b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.786162 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6cb45645f7-6b8qk"] Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.803135 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb45645f7-6b8qk"] Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.835064 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.890512 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6x4kp-config-6jdqt"] Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.897726 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6x4kp-config-6jdqt"] Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.906340 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.910388 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.916908 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-operator-scripts\") pod \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.917124 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjtdp\" (UniqueName: \"kubernetes.io/projected/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-kube-api-access-bjtdp\") pod \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\" (UID: \"be31ea3f-183c-4282-bf47-2fc7e17ab5b0\") " Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.921609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-kube-api-access-bjtdp" (OuterVolumeSpecName: "kube-api-access-bjtdp") pod "be31ea3f-183c-4282-bf47-2fc7e17ab5b0" (UID: "be31ea3f-183c-4282-bf47-2fc7e17ab5b0"). InnerVolumeSpecName "kube-api-access-bjtdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:52 crc kubenswrapper[4728]: I0125 05:53:52.925746 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be31ea3f-183c-4282-bf47-2fc7e17ab5b0" (UID: "be31ea3f-183c-4282-bf47-2fc7e17ab5b0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.018337 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-operator-scripts\") pod \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\" (UID: \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.018387 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0995cb23-3429-4757-95d1-7f48216b7dce-operator-scripts\") pod \"0995cb23-3429-4757-95d1-7f48216b7dce\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.018414 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mct4m\" (UniqueName: \"kubernetes.io/projected/0995cb23-3429-4757-95d1-7f48216b7dce-kube-api-access-mct4m\") pod \"0995cb23-3429-4757-95d1-7f48216b7dce\" (UID: \"0995cb23-3429-4757-95d1-7f48216b7dce\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.018448 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m88w\" (UniqueName: \"kubernetes.io/projected/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-kube-api-access-2m88w\") pod \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\" (UID: \"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.019074 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0995cb23-3429-4757-95d1-7f48216b7dce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0995cb23-3429-4757-95d1-7f48216b7dce" (UID: "0995cb23-3429-4757-95d1-7f48216b7dce"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.019230 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0995cb23-3429-4757-95d1-7f48216b7dce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.019264 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjtdp\" (UniqueName: \"kubernetes.io/projected/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-kube-api-access-bjtdp\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.019275 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be31ea3f-183c-4282-bf47-2fc7e17ab5b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.019476 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" (UID: "d022c478-e9a3-40d9-b37e-54d2ba1e5ba1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.021109 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0995cb23-3429-4757-95d1-7f48216b7dce-kube-api-access-mct4m" (OuterVolumeSpecName: "kube-api-access-mct4m") pod "0995cb23-3429-4757-95d1-7f48216b7dce" (UID: "0995cb23-3429-4757-95d1-7f48216b7dce"). InnerVolumeSpecName "kube-api-access-mct4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.021520 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-kube-api-access-2m88w" (OuterVolumeSpecName: "kube-api-access-2m88w") pod "d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" (UID: "d022c478-e9a3-40d9-b37e-54d2ba1e5ba1"). InnerVolumeSpecName "kube-api-access-2m88w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.122129 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.122400 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mct4m\" (UniqueName: \"kubernetes.io/projected/0995cb23-3429-4757-95d1-7f48216b7dce-kube-api-access-mct4m\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.122557 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m88w\" (UniqueName: \"kubernetes.io/projected/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1-kube-api-access-2m88w\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.338338 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472f53a9-932e-4244-9136-9249f9f0e3ce" path="/var/lib/kubelet/pods/472f53a9-932e-4244-9136-9249f9f0e3ce/volumes" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.339076 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" path="/var/lib/kubelet/pods/f62a2680-ed4b-449b-925c-e243731ea8b4/volumes" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.435802 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-6x4kp" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.465497 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kq85j" event={"ID":"0995cb23-3429-4757-95d1-7f48216b7dce","Type":"ContainerDied","Data":"1ed8cbcb86beb7f012dc4b0154f895bf33f1bb59226597110569fd0925ef3a5b"} Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.465532 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ed8cbcb86beb7f012dc4b0154f895bf33f1bb59226597110569fd0925ef3a5b" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.465512 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kq85j" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.467651 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dllgm" event={"ID":"d022c478-e9a3-40d9-b37e-54d2ba1e5ba1","Type":"ContainerDied","Data":"18b1dd7560e09c419769d8f41425178a5e03c2b5d171b4debd64a273a34d6336"} Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.467674 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b1dd7560e09c419769d8f41425178a5e03c2b5d171b4debd64a273a34d6336" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.467661 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-dllgm" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.469277 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-704e-account-create-update-9vw4c" event={"ID":"be31ea3f-183c-4282-bf47-2fc7e17ab5b0","Type":"ContainerDied","Data":"ffe651d9e38bfeff50e0aa71bb68663b635b52d0786ec4f2dea73202fa89c923"} Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.469298 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe651d9e38bfeff50e0aa71bb68663b635b52d0786ec4f2dea73202fa89c923" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.469354 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-704e-account-create-update-9vw4c" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.721473 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.812532 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.845761 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803f99c7-af4a-4c8a-99ac-42a58563c3d2-operator-scripts\") pod \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.845892 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f56fg\" (UniqueName: \"kubernetes.io/projected/803f99c7-af4a-4c8a-99ac-42a58563c3d2-kube-api-access-f56fg\") pod \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\" (UID: \"803f99c7-af4a-4c8a-99ac-42a58563c3d2\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.847220 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803f99c7-af4a-4c8a-99ac-42a58563c3d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "803f99c7-af4a-4c8a-99ac-42a58563c3d2" (UID: "803f99c7-af4a-4c8a-99ac-42a58563c3d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.850132 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803f99c7-af4a-4c8a-99ac-42a58563c3d2-kube-api-access-f56fg" (OuterVolumeSpecName: "kube-api-access-f56fg") pod "803f99c7-af4a-4c8a-99ac-42a58563c3d2" (UID: "803f99c7-af4a-4c8a-99ac-42a58563c3d2"). InnerVolumeSpecName "kube-api-access-f56fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.876918 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.947618 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ml78\" (UniqueName: \"kubernetes.io/projected/7319f4f1-86d5-4681-ba6c-012c0f3039ac-kube-api-access-6ml78\") pod \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.947793 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7319f4f1-86d5-4681-ba6c-012c0f3039ac-operator-scripts\") pod \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\" (UID: \"7319f4f1-86d5-4681-ba6c-012c0f3039ac\") " Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.948080 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803f99c7-af4a-4c8a-99ac-42a58563c3d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.948099 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f56fg\" (UniqueName: \"kubernetes.io/projected/803f99c7-af4a-4c8a-99ac-42a58563c3d2-kube-api-access-f56fg\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.948204 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7319f4f1-86d5-4681-ba6c-012c0f3039ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7319f4f1-86d5-4681-ba6c-012c0f3039ac" (UID: "7319f4f1-86d5-4681-ba6c-012c0f3039ac"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:53 crc kubenswrapper[4728]: I0125 05:53:53.951040 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7319f4f1-86d5-4681-ba6c-012c0f3039ac-kube-api-access-6ml78" (OuterVolumeSpecName: "kube-api-access-6ml78") pod "7319f4f1-86d5-4681-ba6c-012c0f3039ac" (UID: "7319f4f1-86d5-4681-ba6c-012c0f3039ac"). InnerVolumeSpecName "kube-api-access-6ml78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.049162 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a53ae7b-d679-4ae7-a6c6-d3465781c613-operator-scripts\") pod \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.049351 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f96c\" (UniqueName: \"kubernetes.io/projected/6a53ae7b-d679-4ae7-a6c6-d3465781c613-kube-api-access-2f96c\") pod \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\" (UID: \"6a53ae7b-d679-4ae7-a6c6-d3465781c613\") " Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.049667 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7319f4f1-86d5-4681-ba6c-012c0f3039ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.049686 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ml78\" (UniqueName: \"kubernetes.io/projected/7319f4f1-86d5-4681-ba6c-012c0f3039ac-kube-api-access-6ml78\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.049729 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6a53ae7b-d679-4ae7-a6c6-d3465781c613-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a53ae7b-d679-4ae7-a6c6-d3465781c613" (UID: "6a53ae7b-d679-4ae7-a6c6-d3465781c613"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.052413 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a53ae7b-d679-4ae7-a6c6-d3465781c613-kube-api-access-2f96c" (OuterVolumeSpecName: "kube-api-access-2f96c") pod "6a53ae7b-d679-4ae7-a6c6-d3465781c613" (UID: "6a53ae7b-d679-4ae7-a6c6-d3465781c613"). InnerVolumeSpecName "kube-api-access-2f96c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.151185 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a53ae7b-d679-4ae7-a6c6-d3465781c613-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.151240 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f96c\" (UniqueName: \"kubernetes.io/projected/6a53ae7b-d679-4ae7-a6c6-d3465781c613-kube-api-access-2f96c\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.482455 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qh5qp" event={"ID":"7319f4f1-86d5-4681-ba6c-012c0f3039ac","Type":"ContainerDied","Data":"6400570f48b7e750e6b971b89dd57594c584c957963f6d709579df06e9c14d7e"} Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.482507 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6400570f48b7e750e6b971b89dd57594c584c957963f6d709579df06e9c14d7e" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.482465 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qh5qp" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.484464 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0621-account-create-update-6q629" event={"ID":"6a53ae7b-d679-4ae7-a6c6-d3465781c613","Type":"ContainerDied","Data":"d3a8d7659c8f4fb982a4c3c83d60717b91ebcd8f267875049c7be8b3f4ef3b62"} Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.484492 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0621-account-create-update-6q629" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.484502 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a8d7659c8f4fb982a4c3c83d60717b91ebcd8f267875049c7be8b3f4ef3b62" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.485881 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6e8-account-create-update-pg7cx" event={"ID":"803f99c7-af4a-4c8a-99ac-42a58563c3d2","Type":"ContainerDied","Data":"2b67b337d202cde03a0e6b8ada079733cfb181a6e57fa3db2db2af24fb546b64"} Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.485921 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b67b337d202cde03a0e6b8ada079733cfb181a6e57fa3db2db2af24fb546b64" Jan 25 05:53:54 crc kubenswrapper[4728]: I0125 05:53:54.485953 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b6e8-account-create-update-pg7cx" Jan 25 05:53:55 crc kubenswrapper[4728]: I0125 05:53:55.496457 4728 generic.go:334] "Generic (PLEG): container finished" podID="db13ac9b-f01e-42d8-b455-db929ef4b64c" containerID="153c3e929ee527e8d0eefcd225e5a333815034615f90b9d17a201bdf5dc4b832" exitCode=0 Jan 25 05:53:55 crc kubenswrapper[4728]: I0125 05:53:55.496559 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8ld9" event={"ID":"db13ac9b-f01e-42d8-b455-db929ef4b64c","Type":"ContainerDied","Data":"153c3e929ee527e8d0eefcd225e5a333815034615f90b9d17a201bdf5dc4b832"} Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.176612 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.309444 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-combined-ca-bundle\") pod \"db13ac9b-f01e-42d8-b455-db929ef4b64c\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.309879 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-db-sync-config-data\") pod \"db13ac9b-f01e-42d8-b455-db929ef4b64c\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.309920 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7wzb\" (UniqueName: \"kubernetes.io/projected/db13ac9b-f01e-42d8-b455-db929ef4b64c-kube-api-access-m7wzb\") pod \"db13ac9b-f01e-42d8-b455-db929ef4b64c\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.310020 4728 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-config-data\") pod \"db13ac9b-f01e-42d8-b455-db929ef4b64c\" (UID: \"db13ac9b-f01e-42d8-b455-db929ef4b64c\") " Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.315874 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db13ac9b-f01e-42d8-b455-db929ef4b64c-kube-api-access-m7wzb" (OuterVolumeSpecName: "kube-api-access-m7wzb") pod "db13ac9b-f01e-42d8-b455-db929ef4b64c" (UID: "db13ac9b-f01e-42d8-b455-db929ef4b64c"). InnerVolumeSpecName "kube-api-access-m7wzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.316233 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "db13ac9b-f01e-42d8-b455-db929ef4b64c" (UID: "db13ac9b-f01e-42d8-b455-db929ef4b64c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.345484 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db13ac9b-f01e-42d8-b455-db929ef4b64c" (UID: "db13ac9b-f01e-42d8-b455-db929ef4b64c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.355526 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-config-data" (OuterVolumeSpecName: "config-data") pod "db13ac9b-f01e-42d8-b455-db929ef4b64c" (UID: "db13ac9b-f01e-42d8-b455-db929ef4b64c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.413870 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.413923 4728 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.413938 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7wzb\" (UniqueName: \"kubernetes.io/projected/db13ac9b-f01e-42d8-b455-db929ef4b64c-kube-api-access-m7wzb\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.413950 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db13ac9b-f01e-42d8-b455-db929ef4b64c-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.519275 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r67j5" event={"ID":"00b9902f-5c07-4a73-8a08-0a1c28e09fd8","Type":"ContainerStarted","Data":"2587cc2bf4bab8cbfe3b128c260ab728490492a762accd349e62fe11a5e81578"} Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.521295 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8ld9" event={"ID":"db13ac9b-f01e-42d8-b455-db929ef4b64c","Type":"ContainerDied","Data":"4830b16106f88b0a9deda0d5573143bcc9c7ecf5fea9b046f0354f9959dff596"} Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.521370 4728 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4830b16106f88b0a9deda0d5573143bcc9c7ecf5fea9b046f0354f9959dff596" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.521454 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8ld9" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.550420 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-r67j5" podStartSLOduration=1.607811136 podStartE2EDuration="7.550400177s" podCreationTimestamp="2026-01-25 05:53:50 +0000 UTC" firstStartedPulling="2026-01-25 05:53:51.166247764 +0000 UTC m=+922.202125745" lastFinishedPulling="2026-01-25 05:53:57.108836805 +0000 UTC m=+928.144714786" observedRunningTime="2026-01-25 05:53:57.541684221 +0000 UTC m=+928.577562201" watchObservedRunningTime="2026-01-25 05:53:57.550400177 +0000 UTC m=+928.586278157" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.847906 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68f966b947-jztp5"] Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.848446 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerName="dnsmasq-dns" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.848535 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerName="dnsmasq-dns" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.848596 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472f53a9-932e-4244-9136-9249f9f0e3ce" containerName="ovn-config" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.848647 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="472f53a9-932e-4244-9136-9249f9f0e3ce" containerName="ovn-config" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.848706 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be31ea3f-183c-4282-bf47-2fc7e17ab5b0" 
containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.848751 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="be31ea3f-183c-4282-bf47-2fc7e17ab5b0" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.848804 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0995cb23-3429-4757-95d1-7f48216b7dce" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.848851 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0995cb23-3429-4757-95d1-7f48216b7dce" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.848899 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803f99c7-af4a-4c8a-99ac-42a58563c3d2" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.848949 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="803f99c7-af4a-4c8a-99ac-42a58563c3d2" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.848997 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849047 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.849100 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db13ac9b-f01e-42d8-b455-db929ef4b64c" containerName="glance-db-sync" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849143 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="db13ac9b-f01e-42d8-b455-db929ef4b64c" containerName="glance-db-sync" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.849191 4728 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerName="init" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849251 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerName="init" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.849297 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a53ae7b-d679-4ae7-a6c6-d3465781c613" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849364 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a53ae7b-d679-4ae7-a6c6-d3465781c613" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: E0125 05:53:57.849422 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7319f4f1-86d5-4681-ba6c-012c0f3039ac" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849479 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7319f4f1-86d5-4681-ba6c-012c0f3039ac" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849687 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="be31ea3f-183c-4282-bf47-2fc7e17ab5b0" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849743 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849793 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="803f99c7-af4a-4c8a-99ac-42a58563c3d2" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849842 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="db13ac9b-f01e-42d8-b455-db929ef4b64c" containerName="glance-db-sync" Jan 25 05:53:57 crc 
kubenswrapper[4728]: I0125 05:53:57.849890 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a53ae7b-d679-4ae7-a6c6-d3465781c613" containerName="mariadb-account-create-update" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.849937 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7319f4f1-86d5-4681-ba6c-012c0f3039ac" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.850063 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="472f53a9-932e-4244-9136-9249f9f0e3ce" containerName="ovn-config" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.850117 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0995cb23-3429-4757-95d1-7f48216b7dce" containerName="mariadb-database-create" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.850171 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62a2680-ed4b-449b-925c-e243731ea8b4" containerName="dnsmasq-dns" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.855559 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:57 crc kubenswrapper[4728]: I0125 05:53:57.862434 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f966b947-jztp5"] Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.029086 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-config\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.029152 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-nb\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.029223 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-svc\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.029280 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-swift-storage-0\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.029365 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-5bb6m\" (UniqueName: \"kubernetes.io/projected/39a74ec2-9049-4c8b-a105-85c7768f1928-kube-api-access-5bb6m\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.029474 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-sb\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.130769 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-sb\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.131066 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-config\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.131104 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-nb\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.131132 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-svc\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.131153 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-swift-storage-0\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.131180 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bb6m\" (UniqueName: \"kubernetes.io/projected/39a74ec2-9049-4c8b-a105-85c7768f1928-kube-api-access-5bb6m\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.132159 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-sb\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.132688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-config\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.133163 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-nb\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.133660 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-svc\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.134127 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-swift-storage-0\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.147046 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bb6m\" (UniqueName: \"kubernetes.io/projected/39a74ec2-9049-4c8b-a105-85c7768f1928-kube-api-access-5bb6m\") pod \"dnsmasq-dns-68f966b947-jztp5\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.168036 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:53:58 crc kubenswrapper[4728]: I0125 05:53:58.597539 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f966b947-jztp5"] Jan 25 05:53:59 crc kubenswrapper[4728]: I0125 05:53:59.537523 4728 generic.go:334] "Generic (PLEG): container finished" podID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerID="05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c" exitCode=0 Jan 25 05:53:59 crc kubenswrapper[4728]: I0125 05:53:59.537648 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f966b947-jztp5" event={"ID":"39a74ec2-9049-4c8b-a105-85c7768f1928","Type":"ContainerDied","Data":"05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c"} Jan 25 05:53:59 crc kubenswrapper[4728]: I0125 05:53:59.538199 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f966b947-jztp5" event={"ID":"39a74ec2-9049-4c8b-a105-85c7768f1928","Type":"ContainerStarted","Data":"b18a328095961952d27f45920c57dde57fc8e13ad7274e9c3abc604b5912f13f"} Jan 25 05:53:59 crc kubenswrapper[4728]: E0125 05:53:59.837884 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice/crio-0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice/crio-30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00b9902f_5c07_4a73_8a08_0a1c28e09fd8.slice/crio-2587cc2bf4bab8cbfe3b128c260ab728490492a762accd349e62fe11a5e81578.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice\": RecentStats: unable to find data in memory cache]" Jan 25 05:54:00 crc kubenswrapper[4728]: I0125 05:54:00.547661 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f966b947-jztp5" event={"ID":"39a74ec2-9049-4c8b-a105-85c7768f1928","Type":"ContainerStarted","Data":"30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14"} Jan 25 05:54:00 crc kubenswrapper[4728]: I0125 05:54:00.548004 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:54:00 crc kubenswrapper[4728]: I0125 05:54:00.549340 4728 generic.go:334] "Generic (PLEG): container finished" podID="00b9902f-5c07-4a73-8a08-0a1c28e09fd8" containerID="2587cc2bf4bab8cbfe3b128c260ab728490492a762accd349e62fe11a5e81578" exitCode=0 Jan 25 05:54:00 crc kubenswrapper[4728]: I0125 05:54:00.549374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r67j5" event={"ID":"00b9902f-5c07-4a73-8a08-0a1c28e09fd8","Type":"ContainerDied","Data":"2587cc2bf4bab8cbfe3b128c260ab728490492a762accd349e62fe11a5e81578"} Jan 25 05:54:00 crc kubenswrapper[4728]: I0125 05:54:00.572594 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68f966b947-jztp5" podStartSLOduration=3.572576095 podStartE2EDuration="3.572576095s" podCreationTimestamp="2026-01-25 05:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:00.565657119 +0000 UTC m=+931.601535099" watchObservedRunningTime="2026-01-25 05:54:00.572576095 
+0000 UTC m=+931.608454065" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:01.914048 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-r67j5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.112756 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmtf\" (UniqueName: \"kubernetes.io/projected/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-kube-api-access-4vmtf\") pod \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.113346 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-config-data\") pod \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.113389 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-combined-ca-bundle\") pod \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\" (UID: \"00b9902f-5c07-4a73-8a08-0a1c28e09fd8\") " Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.118526 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-kube-api-access-4vmtf" (OuterVolumeSpecName: "kube-api-access-4vmtf") pod "00b9902f-5c07-4a73-8a08-0a1c28e09fd8" (UID: "00b9902f-5c07-4a73-8a08-0a1c28e09fd8"). InnerVolumeSpecName "kube-api-access-4vmtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.136684 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00b9902f-5c07-4a73-8a08-0a1c28e09fd8" (UID: "00b9902f-5c07-4a73-8a08-0a1c28e09fd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.152612 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-config-data" (OuterVolumeSpecName: "config-data") pod "00b9902f-5c07-4a73-8a08-0a1c28e09fd8" (UID: "00b9902f-5c07-4a73-8a08-0a1c28e09fd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.215754 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.215779 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.215791 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmtf\" (UniqueName: \"kubernetes.io/projected/00b9902f-5c07-4a73-8a08-0a1c28e09fd8-kube-api-access-4vmtf\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.568099 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-r67j5" 
event={"ID":"00b9902f-5c07-4a73-8a08-0a1c28e09fd8","Type":"ContainerDied","Data":"403fad4e938a44317f5a12d94b831521f764c9ca4da8b6664c47c2e280daf665"} Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.568174 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403fad4e938a44317f5a12d94b831521f764c9ca4da8b6664c47c2e280daf665" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.568185 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-r67j5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.793961 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f966b947-jztp5"] Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.794191 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68f966b947-jztp5" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerName="dnsmasq-dns" containerID="cri-o://30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14" gracePeriod=10 Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.814589 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b5f67b999-4czqr"] Jan 25 05:54:02 crc kubenswrapper[4728]: E0125 05:54:02.815145 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b9902f-5c07-4a73-8a08-0a1c28e09fd8" containerName="keystone-db-sync" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.815163 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b9902f-5c07-4a73-8a08-0a1c28e09fd8" containerName="keystone-db-sync" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.815310 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b9902f-5c07-4a73-8a08-0a1c28e09fd8" containerName="keystone-db-sync" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.816039 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.838973 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzjkv\" (UniqueName: \"kubernetes.io/projected/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-kube-api-access-fzjkv\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.839056 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-swift-storage-0\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.839150 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-nb\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.839180 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-svc\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.839248 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.839282 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-config\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.848874 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vhbc5"] Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.853004 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.856244 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.856518 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.856675 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.856814 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lk66m" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.857136 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.873192 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vhbc5"] Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.917477 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b5f67b999-4czqr"] Jan 25 
05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.942873 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-credential-keys\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.942967 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-config-data\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.942999 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-nb\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943043 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-svc\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943071 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-scripts\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943137 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-combined-ca-bundle\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943161 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-sb\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943182 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqbbc\" (UniqueName: \"kubernetes.io/projected/c4ce264c-d5a9-4521-b233-f107e3ee8871-kube-api-access-rqbbc\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943233 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-config\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943277 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-fernet-keys\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943330 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzjkv\" (UniqueName: \"kubernetes.io/projected/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-kube-api-access-fzjkv\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.943348 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-swift-storage-0\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.944267 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-swift-storage-0\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.952140 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-sb\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.955716 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-config\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.955842 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-nb\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.964910 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-svc\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:02 crc kubenswrapper[4728]: I0125 05:54:02.977230 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzjkv\" (UniqueName: \"kubernetes.io/projected/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-kube-api-access-fzjkv\") pod \"dnsmasq-dns-7b5f67b999-4czqr\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.019380 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.021069 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.024129 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.024349 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.031093 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-csg7p"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.032004 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.033238 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jhblj"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.033677 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.035127 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.042436 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.046935 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-config-data\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.046975 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-combined-ca-bundle\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7rfn\" (UniqueName: \"kubernetes.io/projected/d6b831ac-72a1-4f11-95c8-b3ee47275501-kube-api-access-q7rfn\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047037 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74d5365c-76ac-4544-b1e2-ae442ee191dd-etc-machine-id\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047055 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-config-data\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047074 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-db-sync-config-data\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-scripts\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047133 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-run-httpd\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047162 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-combined-ca-bundle\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047185 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqbbc\" (UniqueName: \"kubernetes.io/projected/c4ce264c-d5a9-4521-b233-f107e3ee8871-kube-api-access-rqbbc\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047200 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-config-data\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047265 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-fernet-keys\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047290 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-scripts\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047329 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w5p8\" (UniqueName: \"kubernetes.io/projected/74d5365c-76ac-4544-b1e2-ae442ee191dd-kube-api-access-6w5p8\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047353 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-log-httpd\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047374 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-scripts\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047389 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047413 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-credential-keys\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.047445 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.050643 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-scripts\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.051391 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-config-data\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.057712 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-credential-keys\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.057775 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-fernet-keys\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.062855 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-combined-ca-bundle\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.069161 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqbbc\" (UniqueName: \"kubernetes.io/projected/c4ce264c-d5a9-4521-b233-f107e3ee8871-kube-api-access-rqbbc\") pod \"keystone-bootstrap-vhbc5\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.078854 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-csg7p"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.131382 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pr68r"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.137830 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.148741 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.148909 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.149278 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-run-httpd\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150176 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-config-data\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-scripts\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150373 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w5p8\" (UniqueName: \"kubernetes.io/projected/74d5365c-76ac-4544-b1e2-ae442ee191dd-kube-api-access-6w5p8\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150436 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-log-httpd\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150512 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-scripts\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150577 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150652 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150726 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-combined-ca-bundle\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150787 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7rfn\" (UniqueName: \"kubernetes.io/projected/d6b831ac-72a1-4f11-95c8-b3ee47275501-kube-api-access-q7rfn\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150848 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74d5365c-76ac-4544-b1e2-ae442ee191dd-etc-machine-id\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150909 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-config-data\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.150966 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-db-sync-config-data\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.152998 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-log-httpd\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.149010 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pf6m5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.149688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-run-httpd\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.154672 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74d5365c-76ac-4544-b1e2-ae442ee191dd-etc-machine-id\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.157258 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-config-data\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.167839 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.168577 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-config-data\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.170257 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-db-sync-config-data\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.170532 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-combined-ca-bundle\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.170721 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.170905 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-scripts\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.171518 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-scripts\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.179769 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pr68r"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.180399 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w5p8\" (UniqueName: \"kubernetes.io/projected/74d5365c-76ac-4544-b1e2-ae442ee191dd-kube-api-access-6w5p8\") pod \"cinder-db-sync-csg7p\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.186962 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7rfn\" (UniqueName: \"kubernetes.io/projected/d6b831ac-72a1-4f11-95c8-b3ee47275501-kube-api-access-q7rfn\") pod \"ceilometer-0\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") " pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.190530 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-72kn4"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.190774 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.191999 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.193388 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ngjgt"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.193809 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.196087 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vhbc5"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.221197 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-72kn4"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.248307 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nv6tr"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.249435 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.251797 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.252035 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.252176 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tl8s8"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.252280 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b5f67b999-4czqr"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.252895 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-config-data\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.252959 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzxl\" (UniqueName: \"kubernetes.io/projected/554cda4a-e73e-4f9c-93aa-23c41ef468a5-kube-api-access-grzxl\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.252985 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58nrl\" (UniqueName: \"kubernetes.io/projected/9d85a5b2-cb44-4190-973a-179ad187fd37-kube-api-access-58nrl\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.253032 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-db-sync-config-data\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.253053 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d85a5b2-cb44-4190-973a-179ad187fd37-logs\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.253124 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwdcm\" (UniqueName: \"kubernetes.io/projected/019c7f41-0990-4235-b29d-8d8e08d34af1-kube-api-access-fwdcm\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.254434 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-combined-ca-bundle\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.254531 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-combined-ca-bundle\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.254557 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-scripts\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.254669 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-combined-ca-bundle\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.254689 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-config\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.262140 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nv6tr"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.323485 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8555c65755-z9ttf"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.325527 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.339785 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.347535 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-csg7p"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.348059 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8555c65755-z9ttf"]
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357105 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-combined-ca-bundle\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357139 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-scripts\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357177 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-swift-storage-0\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357202 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tclv5\" (UniqueName: \"kubernetes.io/projected/44d23e17-65a8-4719-a9ca-a69f392fdbf3-kube-api-access-tclv5\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357238 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-config\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357253 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-combined-ca-bundle\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357331 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-config-data\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357364 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzxl\" (UniqueName: \"kubernetes.io/projected/554cda4a-e73e-4f9c-93aa-23c41ef468a5-kube-api-access-grzxl\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357386 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58nrl\" (UniqueName: \"kubernetes.io/projected/9d85a5b2-cb44-4190-973a-179ad187fd37-kube-api-access-58nrl\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357428 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-db-sync-config-data\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357453 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d85a5b2-cb44-4190-973a-179ad187fd37-logs\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357472 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-nb\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357524 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwdcm\" (UniqueName: \"kubernetes.io/projected/019c7f41-0990-4235-b29d-8d8e08d34af1-kube-api-access-fwdcm\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357544 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-svc\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357568 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-sb\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357618 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-config\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.357641 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-combined-ca-bundle\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.358917 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d85a5b2-cb44-4190-973a-179ad187fd37-logs\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr"
Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.363043 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-combined-ca-bundle\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4"
Jan
25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.364268 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-combined-ca-bundle\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.366090 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-db-sync-config-data\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.366228 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-config\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.367262 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-config-data\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.368116 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-combined-ca-bundle\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.368505 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-scripts\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.385592 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwdcm\" (UniqueName: \"kubernetes.io/projected/019c7f41-0990-4235-b29d-8d8e08d34af1-kube-api-access-fwdcm\") pod \"neutron-db-sync-pr68r\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " pod="openstack/neutron-db-sync-pr68r" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.386069 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzxl\" (UniqueName: \"kubernetes.io/projected/554cda4a-e73e-4f9c-93aa-23c41ef468a5-kube-api-access-grzxl\") pod \"barbican-db-sync-72kn4\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " pod="openstack/barbican-db-sync-72kn4" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.389867 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58nrl\" (UniqueName: \"kubernetes.io/projected/9d85a5b2-cb44-4190-973a-179ad187fd37-kube-api-access-58nrl\") pod \"placement-db-sync-nv6tr\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.459198 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-nb\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.459281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-svc\") pod 
\"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.459304 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-sb\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.459367 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-config\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.459425 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-swift-storage-0\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.459458 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tclv5\" (UniqueName: \"kubernetes.io/projected/44d23e17-65a8-4719-a9ca-a69f392fdbf3-kube-api-access-tclv5\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.460563 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-nb\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: 
\"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.460735 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-svc\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.461115 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-config\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.462027 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-swift-storage-0\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.462603 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-sb\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.477399 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tclv5\" (UniqueName: \"kubernetes.io/projected/44d23e17-65a8-4719-a9ca-a69f392fdbf3-kube-api-access-tclv5\") pod \"dnsmasq-dns-8555c65755-z9ttf\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 
05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.489829 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.496145 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pr68r" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.529933 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-72kn4" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.560232 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-nb\") pod \"39a74ec2-9049-4c8b-a105-85c7768f1928\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.560309 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-svc\") pod \"39a74ec2-9049-4c8b-a105-85c7768f1928\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.560359 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bb6m\" (UniqueName: \"kubernetes.io/projected/39a74ec2-9049-4c8b-a105-85c7768f1928-kube-api-access-5bb6m\") pod \"39a74ec2-9049-4c8b-a105-85c7768f1928\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.560424 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-config\") pod \"39a74ec2-9049-4c8b-a105-85c7768f1928\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 
05:54:03.560484 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-swift-storage-0\") pod \"39a74ec2-9049-4c8b-a105-85c7768f1928\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.560508 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-sb\") pod \"39a74ec2-9049-4c8b-a105-85c7768f1928\" (UID: \"39a74ec2-9049-4c8b-a105-85c7768f1928\") " Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.565732 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a74ec2-9049-4c8b-a105-85c7768f1928-kube-api-access-5bb6m" (OuterVolumeSpecName: "kube-api-access-5bb6m") pod "39a74ec2-9049-4c8b-a105-85c7768f1928" (UID: "39a74ec2-9049-4c8b-a105-85c7768f1928"). InnerVolumeSpecName "kube-api-access-5bb6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.597482 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.629978 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39a74ec2-9049-4c8b-a105-85c7768f1928" (UID: "39a74ec2-9049-4c8b-a105-85c7768f1928"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.644835 4728 generic.go:334] "Generic (PLEG): container finished" podID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerID="30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14" exitCode=0 Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.645385 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f966b947-jztp5" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.645411 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f966b947-jztp5" event={"ID":"39a74ec2-9049-4c8b-a105-85c7768f1928","Type":"ContainerDied","Data":"30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14"} Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.648916 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f966b947-jztp5" event={"ID":"39a74ec2-9049-4c8b-a105-85c7768f1928","Type":"ContainerDied","Data":"b18a328095961952d27f45920c57dde57fc8e13ad7274e9c3abc604b5912f13f"} Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.648957 4728 scope.go:117] "RemoveContainer" containerID="30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.657586 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39a74ec2-9049-4c8b-a105-85c7768f1928" (UID: "39a74ec2-9049-4c8b-a105-85c7768f1928"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.667786 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-config" (OuterVolumeSpecName: "config") pod "39a74ec2-9049-4c8b-a105-85c7768f1928" (UID: "39a74ec2-9049-4c8b-a105-85c7768f1928"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.669905 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.669992 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.670054 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bb6m\" (UniqueName: \"kubernetes.io/projected/39a74ec2-9049-4c8b-a105-85c7768f1928-kube-api-access-5bb6m\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.670106 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.681911 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.683716 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39a74ec2-9049-4c8b-a105-85c7768f1928" (UID: "39a74ec2-9049-4c8b-a105-85c7768f1928"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.690876 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39a74ec2-9049-4c8b-a105-85c7768f1928" (UID: "39a74ec2-9049-4c8b-a105-85c7768f1928"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.738515 4728 scope.go:117] "RemoveContainer" containerID="05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.772567 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.772595 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a74ec2-9049-4c8b-a105-85c7768f1928-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.791440 4728 scope.go:117] "RemoveContainer" containerID="30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14" Jan 25 05:54:03 crc kubenswrapper[4728]: E0125 05:54:03.792588 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14\": container with ID starting with 30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14 not found: ID does not exist" containerID="30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.792630 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14"} err="failed to get container status \"30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14\": rpc error: code = NotFound desc = could not find container \"30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14\": container with ID starting with 30c2d158b18481bca956204a8fbd2b485f9b5222a8c0ad1cad1520c344c5eb14 not found: ID does not exist" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.792658 4728 scope.go:117] "RemoveContainer" containerID="05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c" Jan 25 05:54:03 crc kubenswrapper[4728]: E0125 05:54:03.792903 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c\": container with ID starting with 05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c not found: ID does not exist" containerID="05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.792919 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c"} err="failed to get container status \"05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c\": rpc error: code = NotFound desc = could not find container 
\"05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c\": container with ID starting with 05407e4ce6d6c504079e0c0be545de367fa34a261e192f3a37cae21c6fc97a1c not found: ID does not exist" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.830263 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b5f67b999-4czqr"] Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.963974 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:03 crc kubenswrapper[4728]: E0125 05:54:03.964281 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerName="dnsmasq-dns" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.964293 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerName="dnsmasq-dns" Jan 25 05:54:03 crc kubenswrapper[4728]: E0125 05:54:03.964300 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerName="init" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.964305 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerName="init" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.964481 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" containerName="dnsmasq-dns" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.965247 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.983983 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jscv9" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.984232 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.984478 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.984796 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 25 05:54:03 crc kubenswrapper[4728]: I0125 05:54:03.999010 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.026383 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vhbc5"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.055402 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.059645 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.062682 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.062946 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.063402 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.084775 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.085906 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.085951 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.085981 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.086012 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbgq5\" (UniqueName: \"kubernetes.io/projected/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-kube-api-access-kbgq5\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.086036 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.086086 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.086120 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-logs\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.107121 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-csg7p"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.135246 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.187736 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.187887 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.187993 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.188087 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.188200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.188919 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.189032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.189241 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.189688 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190168 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbgq5\" (UniqueName: 
\"kubernetes.io/projected/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-kube-api-access-kbgq5\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190236 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190264 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190301 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190350 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5c97\" (UniqueName: \"kubernetes.io/projected/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-kube-api-access-n5c97\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190371 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190469 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190740 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.190879 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-logs\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.191228 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-logs\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.191644 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.195301 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.199060 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.204768 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f966b947-jztp5"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.206404 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbgq5\" (UniqueName: \"kubernetes.io/projected/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-kube-api-access-kbgq5\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.209619 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68f966b947-jztp5"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.214115 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.236855 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.283093 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-72kn4"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.292273 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.292306 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.292745 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.292834 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.293248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.293283 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.293303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5c97\" (UniqueName: \"kubernetes.io/projected/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-kube-api-access-n5c97\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.293664 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.293708 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.294765 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.295201 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.297957 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.298272 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.300183 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 
05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.300485 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.312893 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pr68r"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.313684 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5c97\" (UniqueName: \"kubernetes.io/projected/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-kube-api-access-n5c97\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: W0125 05:54:04.316622 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod019c7f41_0990_4235_b29d_8d8e08d34af1.slice/crio-5e825ebf491278e91f53431df679230f21c0ef962d5cf85569b8eeef967c4445 WatchSource:0}: Error finding container 5e825ebf491278e91f53431df679230f21c0ef962d5cf85569b8eeef967c4445: Status 404 returned error can't find the container with id 5e825ebf491278e91f53431df679230f21c0ef962d5cf85569b8eeef967c4445 Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.316946 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.426191 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8555c65755-z9ttf"] 
Jan 25 05:54:04 crc kubenswrapper[4728]: W0125 05:54:04.429932 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44d23e17_65a8_4719_a9ca_a69f392fdbf3.slice/crio-2bc82c19f5edb4b92db97c268f72026b3b6633dc886b7e3b45f68135bfcd0129 WatchSource:0}: Error finding container 2bc82c19f5edb4b92db97c268f72026b3b6633dc886b7e3b45f68135bfcd0129: Status 404 returned error can't find the container with id 2bc82c19f5edb4b92db97c268f72026b3b6633dc886b7e3b45f68135bfcd0129 Jan 25 05:54:04 crc kubenswrapper[4728]: W0125 05:54:04.434769 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d85a5b2_cb44_4190_973a_179ad187fd37.slice/crio-07aa458a4a3bf541541ed2897b42e63c87458423a2ea34da4acf3d11ea5b3146 WatchSource:0}: Error finding container 07aa458a4a3bf541541ed2897b42e63c87458423a2ea34da4acf3d11ea5b3146: Status 404 returned error can't find the container with id 07aa458a4a3bf541541ed2897b42e63c87458423a2ea34da4acf3d11ea5b3146 Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.435173 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nv6tr"] Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.504187 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.511740 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.671589 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" event={"ID":"cfdc2c59-dcf7-4617-9f6e-6024dbb39677","Type":"ContainerStarted","Data":"e35c1f0625ed3b301ab08013fc042cfa5b1fed4c645eb5fa7abbc8a2858e6f60"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.673187 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" event={"ID":"44d23e17-65a8-4719-a9ca-a69f392fdbf3","Type":"ContainerStarted","Data":"2bc82c19f5edb4b92db97c268f72026b3b6633dc886b7e3b45f68135bfcd0129"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.674271 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerStarted","Data":"3663d4d024db7e0849d4842ccdfeaccd60fddde19df6b9e621ca38c761d1ea1c"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.675866 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pr68r" event={"ID":"019c7f41-0990-4235-b29d-8d8e08d34af1","Type":"ContainerStarted","Data":"5e825ebf491278e91f53431df679230f21c0ef962d5cf85569b8eeef967c4445"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.676691 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-csg7p" event={"ID":"74d5365c-76ac-4544-b1e2-ae442ee191dd","Type":"ContainerStarted","Data":"3fcd1b578ab4fe91fdc5e10396004dc08ababf04ff2de627fbef6aaa1902e2aa"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.678247 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nv6tr" event={"ID":"9d85a5b2-cb44-4190-973a-179ad187fd37","Type":"ContainerStarted","Data":"07aa458a4a3bf541541ed2897b42e63c87458423a2ea34da4acf3d11ea5b3146"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.678931 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vhbc5" event={"ID":"c4ce264c-d5a9-4521-b233-f107e3ee8871","Type":"ContainerStarted","Data":"74d3e0ec43957a79683bf9f5b59eae3211f3707be78c23fa32ce9d04231e9204"} Jan 25 05:54:04 crc kubenswrapper[4728]: I0125 05:54:04.681195 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-72kn4" event={"ID":"554cda4a-e73e-4f9c-93aa-23c41ef468a5","Type":"ContainerStarted","Data":"c713be16219343ff728926c82a077f873b9006ecd13b18683f9868fda2231efe"} Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.026295 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.045713 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.065077 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.126090 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.159988 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:05 crc kubenswrapper[4728]: W0125 05:54:05.173571 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f6ba592_a2c6_4e4f_98dc_fbfdf181bd77.slice/crio-33f4c2e40ab4b190304f6efa20d1a03a00797c52cbb71c003becad74b0199aaf WatchSource:0}: Error finding container 33f4c2e40ab4b190304f6efa20d1a03a00797c52cbb71c003becad74b0199aaf: Status 404 returned error can't find the container with id 33f4c2e40ab4b190304f6efa20d1a03a00797c52cbb71c003becad74b0199aaf Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.355493 4728 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="39a74ec2-9049-4c8b-a105-85c7768f1928" path="/var/lib/kubelet/pods/39a74ec2-9049-4c8b-a105-85c7768f1928/volumes" Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.715469 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77","Type":"ContainerStarted","Data":"33f4c2e40ab4b190304f6efa20d1a03a00797c52cbb71c003becad74b0199aaf"} Jan 25 05:54:05 crc kubenswrapper[4728]: I0125 05:54:05.726522 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea89653a-610b-4f1c-ba3b-fc3454c2a11b","Type":"ContainerStarted","Data":"fec6cf4939b5344036eac2a94e4772a2c57cab03b813d0b0de6b652545b54c94"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.737493 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea89653a-610b-4f1c-ba3b-fc3454c2a11b","Type":"ContainerStarted","Data":"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.740947 4728 generic.go:334] "Generic (PLEG): container finished" podID="cfdc2c59-dcf7-4617-9f6e-6024dbb39677" containerID="c47dbad92ab39b66d551e84198ab7485a1f64434f16d5cb12ccb7dc13cb0e745" exitCode=0 Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.741023 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" event={"ID":"cfdc2c59-dcf7-4617-9f6e-6024dbb39677","Type":"ContainerDied","Data":"c47dbad92ab39b66d551e84198ab7485a1f64434f16d5cb12ccb7dc13cb0e745"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.744053 4728 generic.go:334] "Generic (PLEG): container finished" podID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerID="0ae8b94e04250731499178bf826d25612cf99df7a25abe2dde637680e4f7b3c3" exitCode=0 Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.744092 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" event={"ID":"44d23e17-65a8-4719-a9ca-a69f392fdbf3","Type":"ContainerDied","Data":"0ae8b94e04250731499178bf826d25612cf99df7a25abe2dde637680e4f7b3c3"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.748559 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77","Type":"ContainerStarted","Data":"c495a7f15f59212d489ad77b5670c644939fc7c28a041aa53fa8eebc4dee7adf"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.750428 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vhbc5" event={"ID":"c4ce264c-d5a9-4521-b233-f107e3ee8871","Type":"ContainerStarted","Data":"3f2a317de61da75b817d6e155322b6bc5f4b316dcebdaeaa17a12b0dcb7c813e"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.773141 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pr68r" event={"ID":"019c7f41-0990-4235-b29d-8d8e08d34af1","Type":"ContainerStarted","Data":"7468c23e4107a46410947d49470ab02fe6f1c86036e819b2bee3284d6485c791"} Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.814541 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vhbc5" podStartSLOduration=4.814523933 podStartE2EDuration="4.814523933s" podCreationTimestamp="2026-01-25 05:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:06.797449617 +0000 UTC m=+937.833327597" watchObservedRunningTime="2026-01-25 05:54:06.814523933 +0000 UTC m=+937.850401913" Jan 25 05:54:06 crc kubenswrapper[4728]: I0125 05:54:06.815992 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pr68r" podStartSLOduration=3.81598606 podStartE2EDuration="3.81598606s" 
podCreationTimestamp="2026-01-25 05:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:06.813130976 +0000 UTC m=+937.849008957" watchObservedRunningTime="2026-01-25 05:54:06.81598606 +0000 UTC m=+937.851864040" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.050699 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.145194 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-config\") pod \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.145574 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzjkv\" (UniqueName: \"kubernetes.io/projected/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-kube-api-access-fzjkv\") pod \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.145627 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-svc\") pod \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.145664 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-nb\") pod \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.145766 4728 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-sb\") pod \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.145882 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-swift-storage-0\") pod \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\" (UID: \"cfdc2c59-dcf7-4617-9f6e-6024dbb39677\") " Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.152922 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-kube-api-access-fzjkv" (OuterVolumeSpecName: "kube-api-access-fzjkv") pod "cfdc2c59-dcf7-4617-9f6e-6024dbb39677" (UID: "cfdc2c59-dcf7-4617-9f6e-6024dbb39677"). InnerVolumeSpecName "kube-api-access-fzjkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.162920 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-config" (OuterVolumeSpecName: "config") pod "cfdc2c59-dcf7-4617-9f6e-6024dbb39677" (UID: "cfdc2c59-dcf7-4617-9f6e-6024dbb39677"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.166173 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfdc2c59-dcf7-4617-9f6e-6024dbb39677" (UID: "cfdc2c59-dcf7-4617-9f6e-6024dbb39677"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.169762 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfdc2c59-dcf7-4617-9f6e-6024dbb39677" (UID: "cfdc2c59-dcf7-4617-9f6e-6024dbb39677"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.173811 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cfdc2c59-dcf7-4617-9f6e-6024dbb39677" (UID: "cfdc2c59-dcf7-4617-9f6e-6024dbb39677"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.174059 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfdc2c59-dcf7-4617-9f6e-6024dbb39677" (UID: "cfdc2c59-dcf7-4617-9f6e-6024dbb39677"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.248148 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzjkv\" (UniqueName: \"kubernetes.io/projected/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-kube-api-access-fzjkv\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.248186 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.248201 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.248211 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.248222 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.248231 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdc2c59-dcf7-4617-9f6e-6024dbb39677-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.783520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" event={"ID":"44d23e17-65a8-4719-a9ca-a69f392fdbf3","Type":"ContainerStarted","Data":"05bd7b472dbb0bc6510904e0b5f17ed1c408264ec0244c3f3aab2db674107e61"} Jan 25 05:54:07 crc 
kubenswrapper[4728]: I0125 05:54:07.783941 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.785532 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77","Type":"ContainerStarted","Data":"839b651ed2bf0e19db19a6d10dc9b1a3c5f81bec38ce35088fc0504b7c9b36a7"} Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.785656 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-log" containerID="cri-o://c495a7f15f59212d489ad77b5670c644939fc7c28a041aa53fa8eebc4dee7adf" gracePeriod=30 Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.785744 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-httpd" containerID="cri-o://839b651ed2bf0e19db19a6d10dc9b1a3c5f81bec38ce35088fc0504b7c9b36a7" gracePeriod=30 Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.788284 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea89653a-610b-4f1c-ba3b-fc3454c2a11b","Type":"ContainerStarted","Data":"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763"} Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.788389 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-log" containerID="cri-o://396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5" gracePeriod=30 Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.788468 4728 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-external-api-0" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-httpd" containerID="cri-o://a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763" gracePeriod=30 Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.790231 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.791397 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f67b999-4czqr" event={"ID":"cfdc2c59-dcf7-4617-9f6e-6024dbb39677","Type":"ContainerDied","Data":"e35c1f0625ed3b301ab08013fc042cfa5b1fed4c645eb5fa7abbc8a2858e6f60"} Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.791427 4728 scope.go:117] "RemoveContainer" containerID="c47dbad92ab39b66d551e84198ab7485a1f64434f16d5cb12ccb7dc13cb0e745" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.805969 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" podStartSLOduration=4.805961769 podStartE2EDuration="4.805961769s" podCreationTimestamp="2026-01-25 05:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:07.798978591 +0000 UTC m=+938.834856571" watchObservedRunningTime="2026-01-25 05:54:07.805961769 +0000 UTC m=+938.841839749" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.856364 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b5f67b999-4czqr"] Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.872883 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.87285814 podStartE2EDuration="5.87285814s" podCreationTimestamp="2026-01-25 05:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:07.861548053 +0000 UTC m=+938.897426034" watchObservedRunningTime="2026-01-25 05:54:07.87285814 +0000 UTC m=+938.908736121" Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.873366 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b5f67b999-4czqr"] Jan 25 05:54:07 crc kubenswrapper[4728]: I0125 05:54:07.896982 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.89696663 podStartE2EDuration="5.89696663s" podCreationTimestamp="2026-01-25 05:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:07.879128684 +0000 UTC m=+938.915006674" watchObservedRunningTime="2026-01-25 05:54:07.89696663 +0000 UTC m=+938.932844610" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.322551 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.473740 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-config-data\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.473791 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-combined-ca-bundle\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.473880 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-public-tls-certs\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.473953 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-scripts\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.474014 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.474174 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-logs\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.474203 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbgq5\" (UniqueName: \"kubernetes.io/projected/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-kube-api-access-kbgq5\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.474282 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-httpd-run\") pod \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\" (UID: \"ea89653a-610b-4f1c-ba3b-fc3454c2a11b\") " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.474893 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-logs" (OuterVolumeSpecName: "logs") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.475224 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.476610 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.476631 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.479358 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-scripts" (OuterVolumeSpecName: "scripts") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.484169 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-kube-api-access-kbgq5" (OuterVolumeSpecName: "kube-api-access-kbgq5") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "kube-api-access-kbgq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.484380 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.498437 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.515693 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-config-data" (OuterVolumeSpecName: "config-data") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.517670 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ea89653a-610b-4f1c-ba3b-fc3454c2a11b" (UID: "ea89653a-610b-4f1c-ba3b-fc3454c2a11b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.578657 4728 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.578690 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.578722 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.578734 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbgq5\" (UniqueName: \"kubernetes.io/projected/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-kube-api-access-kbgq5\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.578747 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.578757 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89653a-610b-4f1c-ba3b-fc3454c2a11b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.595055 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.682933 4728 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.809909 4728 generic.go:334] "Generic (PLEG): container finished" podID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerID="a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763" exitCode=0 Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.812060 4728 generic.go:334] "Generic (PLEG): container finished" podID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerID="396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5" exitCode=143 Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.809967 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.809984 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea89653a-610b-4f1c-ba3b-fc3454c2a11b","Type":"ContainerDied","Data":"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763"} Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.812849 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea89653a-610b-4f1c-ba3b-fc3454c2a11b","Type":"ContainerDied","Data":"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5"} Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.812877 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea89653a-610b-4f1c-ba3b-fc3454c2a11b","Type":"ContainerDied","Data":"fec6cf4939b5344036eac2a94e4772a2c57cab03b813d0b0de6b652545b54c94"} Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.812906 4728 scope.go:117] "RemoveContainer" containerID="a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 
05:54:08.833504 4728 generic.go:334] "Generic (PLEG): container finished" podID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerID="839b651ed2bf0e19db19a6d10dc9b1a3c5f81bec38ce35088fc0504b7c9b36a7" exitCode=0 Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.833540 4728 generic.go:334] "Generic (PLEG): container finished" podID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerID="c495a7f15f59212d489ad77b5670c644939fc7c28a041aa53fa8eebc4dee7adf" exitCode=143 Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.833939 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77","Type":"ContainerDied","Data":"839b651ed2bf0e19db19a6d10dc9b1a3c5f81bec38ce35088fc0504b7c9b36a7"} Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.833972 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77","Type":"ContainerDied","Data":"c495a7f15f59212d489ad77b5670c644939fc7c28a041aa53fa8eebc4dee7adf"} Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.855113 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.870499 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878017 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:08 crc kubenswrapper[4728]: E0125 05:54:08.878512 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-httpd" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878528 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-httpd" Jan 25 05:54:08 crc 
kubenswrapper[4728]: E0125 05:54:08.878562 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-log" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878574 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-log" Jan 25 05:54:08 crc kubenswrapper[4728]: E0125 05:54:08.878614 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdc2c59-dcf7-4617-9f6e-6024dbb39677" containerName="init" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878622 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdc2c59-dcf7-4617-9f6e-6024dbb39677" containerName="init" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878857 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfdc2c59-dcf7-4617-9f6e-6024dbb39677" containerName="init" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878878 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-log" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.878888 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" containerName="glance-httpd" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.879832 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.882281 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.882504 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.884750 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994282 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994574 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-config-data\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994593 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-logs\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994646 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-scripts\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994683 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx7hd\" (UniqueName: \"kubernetes.io/projected/3989d07c-292d-40ec-ac11-ffce42ffde68-kube-api-access-gx7hd\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994706 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994812 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:08 crc kubenswrapper[4728]: I0125 05:54:08.994860 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096242 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096287 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096307 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096380 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096402 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-config-data\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096419 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-logs\") pod 
\"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096471 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-scripts\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096512 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx7hd\" (UniqueName: \"kubernetes.io/projected/3989d07c-292d-40ec-ac11-ffce42ffde68-kube-api-access-gx7hd\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096695 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.096452 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.097039 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-logs\") pod \"glance-default-external-api-0\" (UID: 
\"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.105100 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.108773 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-config-data\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.112064 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx7hd\" (UniqueName: \"kubernetes.io/projected/3989d07c-292d-40ec-ac11-ffce42ffde68-kube-api-access-gx7hd\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.112420 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.112973 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-scripts\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" 
Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.118825 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.199411 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.338429 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfdc2c59-dcf7-4617-9f6e-6024dbb39677" path="/var/lib/kubelet/pods/cfdc2c59-dcf7-4617-9f6e-6024dbb39677/volumes" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.339066 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea89653a-610b-4f1c-ba3b-fc3454c2a11b" path="/var/lib/kubelet/pods/ea89653a-610b-4f1c-ba3b-fc3454c2a11b/volumes" Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.845193 4728 generic.go:334] "Generic (PLEG): container finished" podID="c4ce264c-d5a9-4521-b233-f107e3ee8871" containerID="3f2a317de61da75b817d6e155322b6bc5f4b316dcebdaeaa17a12b0dcb7c813e" exitCode=0 Jan 25 05:54:09 crc kubenswrapper[4728]: I0125 05:54:09.845284 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vhbc5" event={"ID":"c4ce264c-d5a9-4521-b233-f107e3ee8871","Type":"ContainerDied","Data":"3f2a317de61da75b817d6e155322b6bc5f4b316dcebdaeaa17a12b0dcb7c813e"} Jan 25 05:54:10 crc kubenswrapper[4728]: E0125 05:54:10.064901 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice/crio-0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice/crio-30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8\": RecentStats: unable to find data in memory cache]" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.847037 4728 scope.go:117] "RemoveContainer" containerID="396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.873074 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77","Type":"ContainerDied","Data":"33f4c2e40ab4b190304f6efa20d1a03a00797c52cbb71c003becad74b0199aaf"} Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.873127 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f4c2e40ab4b190304f6efa20d1a03a00797c52cbb71c003becad74b0199aaf" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.966013 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.978871 4728 scope.go:117] "RemoveContainer" containerID="a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763" Jan 25 05:54:10 crc kubenswrapper[4728]: E0125 05:54:10.979304 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763\": container with ID starting with a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763 not found: ID does not exist" containerID="a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.979344 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763"} err="failed to get container status \"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763\": rpc error: code = NotFound desc = could not find container \"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763\": container with ID starting with a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763 not found: ID does not exist" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.979362 4728 scope.go:117] "RemoveContainer" containerID="396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5" Jan 25 05:54:10 crc kubenswrapper[4728]: E0125 05:54:10.979589 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5\": container with ID starting with 396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5 not found: ID does not exist" containerID="396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 
05:54:10.979608 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5"} err="failed to get container status \"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5\": rpc error: code = NotFound desc = could not find container \"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5\": container with ID starting with 396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5 not found: ID does not exist" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.979620 4728 scope.go:117] "RemoveContainer" containerID="a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.979809 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763"} err="failed to get container status \"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763\": rpc error: code = NotFound desc = could not find container \"a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763\": container with ID starting with a112a30793ad6f7fae1f11e590cc185415c3f55c7b81c9ec1570cdbf8ae71763 not found: ID does not exist" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.979825 4728 scope.go:117] "RemoveContainer" containerID="396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5" Jan 25 05:54:10 crc kubenswrapper[4728]: I0125 05:54:10.980016 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5"} err="failed to get container status \"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5\": rpc error: code = NotFound desc = could not find container \"396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5\": container with ID starting with 
396d0250e65ae2b850f9b2b6842b956d88e275d8d038c4e4db654349f02abda5 not found: ID does not exist" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.051902 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-config-data\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.051969 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-internal-tls-certs\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052011 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5c97\" (UniqueName: \"kubernetes.io/projected/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-kube-api-access-n5c97\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052045 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052095 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-combined-ca-bundle\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052189 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-logs\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052238 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-scripts\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052291 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-httpd-run\") pod \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\" (UID: \"5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052668 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-logs" (OuterVolumeSpecName: "logs") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.052837 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.053526 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.056543 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-scripts" (OuterVolumeSpecName: "scripts") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.057680 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-kube-api-access-n5c97" (OuterVolumeSpecName: "kube-api-access-n5c97") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "kube-api-access-n5c97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.062447 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.112201 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.114426 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.120951 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.130453 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-config-data" (OuterVolumeSpecName: "config-data") pod "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" (UID: "5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155387 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155436 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155448 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155456 4728 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155478 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5c97\" (UniqueName: \"kubernetes.io/projected/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-kube-api-access-n5c97\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155529 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.155539 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.170526 4728 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.256751 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-credential-keys\") pod \"c4ce264c-d5a9-4521-b233-f107e3ee8871\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.256800 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-fernet-keys\") pod \"c4ce264c-d5a9-4521-b233-f107e3ee8871\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.256895 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-config-data\") pod \"c4ce264c-d5a9-4521-b233-f107e3ee8871\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.256970 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-combined-ca-bundle\") pod \"c4ce264c-d5a9-4521-b233-f107e3ee8871\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.257005 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-scripts\") pod \"c4ce264c-d5a9-4521-b233-f107e3ee8871\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.257033 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-rqbbc\" (UniqueName: \"kubernetes.io/projected/c4ce264c-d5a9-4521-b233-f107e3ee8871-kube-api-access-rqbbc\") pod \"c4ce264c-d5a9-4521-b233-f107e3ee8871\" (UID: \"c4ce264c-d5a9-4521-b233-f107e3ee8871\") " Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.257340 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.261007 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ce264c-d5a9-4521-b233-f107e3ee8871-kube-api-access-rqbbc" (OuterVolumeSpecName: "kube-api-access-rqbbc") pod "c4ce264c-d5a9-4521-b233-f107e3ee8871" (UID: "c4ce264c-d5a9-4521-b233-f107e3ee8871"). InnerVolumeSpecName "kube-api-access-rqbbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.261808 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-scripts" (OuterVolumeSpecName: "scripts") pod "c4ce264c-d5a9-4521-b233-f107e3ee8871" (UID: "c4ce264c-d5a9-4521-b233-f107e3ee8871"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.261978 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c4ce264c-d5a9-4521-b233-f107e3ee8871" (UID: "c4ce264c-d5a9-4521-b233-f107e3ee8871"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.262205 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c4ce264c-d5a9-4521-b233-f107e3ee8871" (UID: "c4ce264c-d5a9-4521-b233-f107e3ee8871"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.276115 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-config-data" (OuterVolumeSpecName: "config-data") pod "c4ce264c-d5a9-4521-b233-f107e3ee8871" (UID: "c4ce264c-d5a9-4521-b233-f107e3ee8871"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.276930 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4ce264c-d5a9-4521-b233-f107e3ee8871" (UID: "c4ce264c-d5a9-4521-b233-f107e3ee8871"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.359273 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.359303 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.359330 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.359340 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqbbc\" (UniqueName: \"kubernetes.io/projected/c4ce264c-d5a9-4521-b233-f107e3ee8871-kube-api-access-rqbbc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.359350 4728 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.359358 4728 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4ce264c-d5a9-4521-b233-f107e3ee8871-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.373180 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:54:11 crc kubenswrapper[4728]: W0125 05:54:11.383490 4728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3989d07c_292d_40ec_ac11_ffce42ffde68.slice/crio-19bc42390f2c1504c99588b1d8e0829bad251a73e59fee27cac498423f9d52a7 WatchSource:0}: Error finding container 19bc42390f2c1504c99588b1d8e0829bad251a73e59fee27cac498423f9d52a7: Status 404 returned error can't find the container with id 19bc42390f2c1504c99588b1d8e0829bad251a73e59fee27cac498423f9d52a7 Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.896365 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3989d07c-292d-40ec-ac11-ffce42ffde68","Type":"ContainerStarted","Data":"a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae"} Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.896638 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3989d07c-292d-40ec-ac11-ffce42ffde68","Type":"ContainerStarted","Data":"19bc42390f2c1504c99588b1d8e0829bad251a73e59fee27cac498423f9d52a7"} Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.897882 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nv6tr" event={"ID":"9d85a5b2-cb44-4190-973a-179ad187fd37","Type":"ContainerStarted","Data":"e21ea8a945a43884e58d6ac8b724dff7592965bb8c6506aaecc4d8213e8f890f"} Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.902975 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.903476 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vhbc5" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.903579 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vhbc5" event={"ID":"c4ce264c-d5a9-4521-b233-f107e3ee8871","Type":"ContainerDied","Data":"74d3e0ec43957a79683bf9f5b59eae3211f3707be78c23fa32ce9d04231e9204"} Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.903617 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d3e0ec43957a79683bf9f5b59eae3211f3707be78c23fa32ce9d04231e9204" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.938884 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nv6tr" podStartSLOduration=2.480711137 podStartE2EDuration="8.938872853s" podCreationTimestamp="2026-01-25 05:54:03 +0000 UTC" firstStartedPulling="2026-01-25 05:54:04.436742167 +0000 UTC m=+935.472620147" lastFinishedPulling="2026-01-25 05:54:10.894903883 +0000 UTC m=+941.930781863" observedRunningTime="2026-01-25 05:54:11.920636216 +0000 UTC m=+942.956514195" watchObservedRunningTime="2026-01-25 05:54:11.938872853 +0000 UTC m=+942.974750833" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.940870 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.948817 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.958525 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:11 crc kubenswrapper[4728]: E0125 05:54:11.958821 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-log" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.958834 4728 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-log" Jan 25 05:54:11 crc kubenswrapper[4728]: E0125 05:54:11.958855 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-httpd" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.958861 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-httpd" Jan 25 05:54:11 crc kubenswrapper[4728]: E0125 05:54:11.958875 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ce264c-d5a9-4521-b233-f107e3ee8871" containerName="keystone-bootstrap" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.958881 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ce264c-d5a9-4521-b233-f107e3ee8871" containerName="keystone-bootstrap" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.959012 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ce264c-d5a9-4521-b233-f107e3ee8871" containerName="keystone-bootstrap" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.959026 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-log" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.959033 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" containerName="glance-httpd" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.959906 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.962610 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.970338 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 25 05:54:11 crc kubenswrapper[4728]: I0125 05:54:11.978094 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.009012 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vhbc5"] Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.025427 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vhbc5"] Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.073902 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.073970 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.074021 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.074107 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.074148 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xqd\" (UniqueName: \"kubernetes.io/projected/de462b89-925b-42f0-9590-a93b2081cc41-kube-api-access-r6xqd\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.074167 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.074219 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-logs\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.074306 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.102241 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sbl2s"] Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.103914 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.107724 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.107924 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lk66m" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.108315 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.108570 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.108763 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.110976 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sbl2s"] Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.176311 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xqd\" (UniqueName: \"kubernetes.io/projected/de462b89-925b-42f0-9590-a93b2081cc41-kube-api-access-r6xqd\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.176678 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.176703 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-logs\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177248 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-logs\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177524 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-config-data\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177564 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-fernet-keys\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177609 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177746 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-combined-ca-bundle\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177806 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177830 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177882 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177908 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2qfp\" (UniqueName: 
\"kubernetes.io/projected/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-kube-api-access-v2qfp\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177941 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177966 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-scripts\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.177990 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.178028 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-credential-keys\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.179134 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.182495 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.189887 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.190567 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.192135 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.193592 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xqd\" (UniqueName: \"kubernetes.io/projected/de462b89-925b-42f0-9590-a93b2081cc41-kube-api-access-r6xqd\") pod 
\"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.203132 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.280688 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-combined-ca-bundle\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.280861 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2qfp\" (UniqueName: \"kubernetes.io/projected/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-kube-api-access-v2qfp\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.280936 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-scripts\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.280995 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-credential-keys\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " 
pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.281076 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-config-data\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.281130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-fernet-keys\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.285108 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-config-data\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.285921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-combined-ca-bundle\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.288248 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-fernet-keys\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.288583 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-scripts\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.290972 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.296756 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-credential-keys\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.298822 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2qfp\" (UniqueName: \"kubernetes.io/projected/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-kube-api-access-v2qfp\") pod \"keystone-bootstrap-sbl2s\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.422674 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.762078 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:54:12 crc kubenswrapper[4728]: W0125 05:54:12.801923 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde462b89_925b_42f0_9590_a93b2081cc41.slice/crio-73bd08adb260d4d7e2fdf8ca7e14d40eb8528b4f961c790504608541f17632a1 WatchSource:0}: Error finding container 73bd08adb260d4d7e2fdf8ca7e14d40eb8528b4f961c790504608541f17632a1: Status 404 returned error can't find the container with id 73bd08adb260d4d7e2fdf8ca7e14d40eb8528b4f961c790504608541f17632a1 Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.858160 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sbl2s"] Jan 25 05:54:12 crc kubenswrapper[4728]: W0125 05:54:12.880983 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdd0ddf2_b2d4_4568_abc5_7fd76cbcb639.slice/crio-a5be560b57040d6518b0b088086bd66c754c70a26ad191473ced932f6417e60c WatchSource:0}: Error finding container a5be560b57040d6518b0b088086bd66c754c70a26ad191473ced932f6417e60c: Status 404 returned error can't find the container with id a5be560b57040d6518b0b088086bd66c754c70a26ad191473ced932f6417e60c Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.927125 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbl2s" event={"ID":"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639","Type":"ContainerStarted","Data":"a5be560b57040d6518b0b088086bd66c754c70a26ad191473ced932f6417e60c"} Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.932703 4728 generic.go:334] "Generic (PLEG): container finished" podID="9d85a5b2-cb44-4190-973a-179ad187fd37" 
containerID="e21ea8a945a43884e58d6ac8b724dff7592965bb8c6506aaecc4d8213e8f890f" exitCode=0 Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.932792 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nv6tr" event={"ID":"9d85a5b2-cb44-4190-973a-179ad187fd37","Type":"ContainerDied","Data":"e21ea8a945a43884e58d6ac8b724dff7592965bb8c6506aaecc4d8213e8f890f"} Jan 25 05:54:12 crc kubenswrapper[4728]: I0125 05:54:12.936636 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de462b89-925b-42f0-9590-a93b2081cc41","Type":"ContainerStarted","Data":"73bd08adb260d4d7e2fdf8ca7e14d40eb8528b4f961c790504608541f17632a1"} Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.342799 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77" path="/var/lib/kubelet/pods/5f6ba592-a2c6-4e4f-98dc-fbfdf181bd77/volumes" Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.343431 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ce264c-d5a9-4521-b233-f107e3ee8871" path="/var/lib/kubelet/pods/c4ce264c-d5a9-4521-b233-f107e3ee8871/volumes" Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.684474 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.748176 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f45b9f4bc-g92ls"] Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.748491 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerName="dnsmasq-dns" containerID="cri-o://a2ad86531326a1df76df40666b7710cca44a224060a7715e3e62fabb803c797b" gracePeriod=10 Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.973552 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbl2s" event={"ID":"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639","Type":"ContainerStarted","Data":"948d5d216e63f9a49474fb936dfb1138a6f5a8b909b42e01b8bab5c42584911a"} Jan 25 05:54:13 crc kubenswrapper[4728]: I0125 05:54:13.996454 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3989d07c-292d-40ec-ac11-ffce42ffde68","Type":"ContainerStarted","Data":"64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce"} Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.003782 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sbl2s" podStartSLOduration=2.003769966 podStartE2EDuration="2.003769966s" podCreationTimestamp="2026-01-25 05:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:14.001929676 +0000 UTC m=+945.037807656" watchObservedRunningTime="2026-01-25 05:54:14.003769966 +0000 UTC m=+945.039647946" Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.024537 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de462b89-925b-42f0-9590-a93b2081cc41","Type":"ContainerStarted","Data":"87e3a51b46f91c1af523d1cef14e5d09b8e8342f0ca05fb5220400f5eb512081"} Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.077695 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.07767924 podStartE2EDuration="6.07767924s" podCreationTimestamp="2026-01-25 05:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:14.065143393 +0000 UTC m=+945.101021362" watchObservedRunningTime="2026-01-25 05:54:14.07767924 +0000 UTC 
m=+945.113557221" Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.078633 4728 generic.go:334] "Generic (PLEG): container finished" podID="019c7f41-0990-4235-b29d-8d8e08d34af1" containerID="7468c23e4107a46410947d49470ab02fe6f1c86036e819b2bee3284d6485c791" exitCode=0 Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.078714 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pr68r" event={"ID":"019c7f41-0990-4235-b29d-8d8e08d34af1","Type":"ContainerDied","Data":"7468c23e4107a46410947d49470ab02fe6f1c86036e819b2bee3284d6485c791"} Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.088274 4728 generic.go:334] "Generic (PLEG): container finished" podID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerID="a2ad86531326a1df76df40666b7710cca44a224060a7715e3e62fabb803c797b" exitCode=0 Jan 25 05:54:14 crc kubenswrapper[4728]: I0125 05:54:14.088489 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" event={"ID":"a8aebef0-3647-4ca2-a703-4d0c2033c7fd","Type":"ContainerDied","Data":"a2ad86531326a1df76df40666b7710cca44a224060a7715e3e62fabb803c797b"} Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.664939 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.741936 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-combined-ca-bundle\") pod \"9d85a5b2-cb44-4190-973a-179ad187fd37\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.742132 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d85a5b2-cb44-4190-973a-179ad187fd37-logs\") pod \"9d85a5b2-cb44-4190-973a-179ad187fd37\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.742174 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-config-data\") pod \"9d85a5b2-cb44-4190-973a-179ad187fd37\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.742304 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58nrl\" (UniqueName: \"kubernetes.io/projected/9d85a5b2-cb44-4190-973a-179ad187fd37-kube-api-access-58nrl\") pod \"9d85a5b2-cb44-4190-973a-179ad187fd37\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.742367 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-scripts\") pod \"9d85a5b2-cb44-4190-973a-179ad187fd37\" (UID: \"9d85a5b2-cb44-4190-973a-179ad187fd37\") " Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.742526 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9d85a5b2-cb44-4190-973a-179ad187fd37-logs" (OuterVolumeSpecName: "logs") pod "9d85a5b2-cb44-4190-973a-179ad187fd37" (UID: "9d85a5b2-cb44-4190-973a-179ad187fd37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.743186 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d85a5b2-cb44-4190-973a-179ad187fd37-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.745588 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d85a5b2-cb44-4190-973a-179ad187fd37-kube-api-access-58nrl" (OuterVolumeSpecName: "kube-api-access-58nrl") pod "9d85a5b2-cb44-4190-973a-179ad187fd37" (UID: "9d85a5b2-cb44-4190-973a-179ad187fd37"). InnerVolumeSpecName "kube-api-access-58nrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.745982 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-scripts" (OuterVolumeSpecName: "scripts") pod "9d85a5b2-cb44-4190-973a-179ad187fd37" (UID: "9d85a5b2-cb44-4190-973a-179ad187fd37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.770182 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d85a5b2-cb44-4190-973a-179ad187fd37" (UID: "9d85a5b2-cb44-4190-973a-179ad187fd37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.787787 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-config-data" (OuterVolumeSpecName: "config-data") pod "9d85a5b2-cb44-4190-973a-179ad187fd37" (UID: "9d85a5b2-cb44-4190-973a-179ad187fd37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.844919 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58nrl\" (UniqueName: \"kubernetes.io/projected/9d85a5b2-cb44-4190-973a-179ad187fd37-kube-api-access-58nrl\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.844949 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.844961 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:15 crc kubenswrapper[4728]: I0125 05:54:15.844973 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d85a5b2-cb44-4190-973a-179ad187fd37-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.048153 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.053147 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pr68r" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.105065 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nv6tr" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.105063 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nv6tr" event={"ID":"9d85a5b2-cb44-4190-973a-179ad187fd37","Type":"ContainerDied","Data":"07aa458a4a3bf541541ed2897b42e63c87458423a2ea34da4acf3d11ea5b3146"} Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.105168 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07aa458a4a3bf541541ed2897b42e63c87458423a2ea34da4acf3d11ea5b3146" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.107384 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pr68r" event={"ID":"019c7f41-0990-4235-b29d-8d8e08d34af1","Type":"ContainerDied","Data":"5e825ebf491278e91f53431df679230f21c0ef962d5cf85569b8eeef967c4445"} Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.107429 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pr68r" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.107437 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e825ebf491278e91f53431df679230f21c0ef962d5cf85569b8eeef967c4445" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.111146 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" event={"ID":"a8aebef0-3647-4ca2-a703-4d0c2033c7fd","Type":"ContainerDied","Data":"a88c3507dbfac6a89410f30fc25bcd17f0e6762d29e4ac3a388b7315cec1887b"} Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.111184 4728 scope.go:117] "RemoveContainer" containerID="a2ad86531326a1df76df40666b7710cca44a224060a7715e3e62fabb803c797b" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.111188 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f45b9f4bc-g92ls" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.147204 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-config\") pod \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.147368 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-combined-ca-bundle\") pod \"019c7f41-0990-4235-b29d-8d8e08d34af1\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.147481 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-svc\") pod \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " 
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.148456 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwdcm\" (UniqueName: \"kubernetes.io/projected/019c7f41-0990-4235-b29d-8d8e08d34af1-kube-api-access-fwdcm\") pod \"019c7f41-0990-4235-b29d-8d8e08d34af1\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.148494 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-sb\") pod \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.148577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-nb\") pod \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.148709 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-config\") pod \"019c7f41-0990-4235-b29d-8d8e08d34af1\" (UID: \"019c7f41-0990-4235-b29d-8d8e08d34af1\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.148771 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-swift-storage-0\") pod \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.148796 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgbmh\" (UniqueName: 
\"kubernetes.io/projected/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-kube-api-access-hgbmh\") pod \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\" (UID: \"a8aebef0-3647-4ca2-a703-4d0c2033c7fd\") " Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.154895 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019c7f41-0990-4235-b29d-8d8e08d34af1-kube-api-access-fwdcm" (OuterVolumeSpecName: "kube-api-access-fwdcm") pod "019c7f41-0990-4235-b29d-8d8e08d34af1" (UID: "019c7f41-0990-4235-b29d-8d8e08d34af1"). InnerVolumeSpecName "kube-api-access-fwdcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.157819 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-kube-api-access-hgbmh" (OuterVolumeSpecName: "kube-api-access-hgbmh") pod "a8aebef0-3647-4ca2-a703-4d0c2033c7fd" (UID: "a8aebef0-3647-4ca2-a703-4d0c2033c7fd"). InnerVolumeSpecName "kube-api-access-hgbmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.183150 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-config" (OuterVolumeSpecName: "config") pod "019c7f41-0990-4235-b29d-8d8e08d34af1" (UID: "019c7f41-0990-4235-b29d-8d8e08d34af1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.187670 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "019c7f41-0990-4235-b29d-8d8e08d34af1" (UID: "019c7f41-0990-4235-b29d-8d8e08d34af1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.196521 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8aebef0-3647-4ca2-a703-4d0c2033c7fd" (UID: "a8aebef0-3647-4ca2-a703-4d0c2033c7fd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.198497 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8aebef0-3647-4ca2-a703-4d0c2033c7fd" (UID: "a8aebef0-3647-4ca2-a703-4d0c2033c7fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.202506 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a8aebef0-3647-4ca2-a703-4d0c2033c7fd" (UID: "a8aebef0-3647-4ca2-a703-4d0c2033c7fd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.203103 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8aebef0-3647-4ca2-a703-4d0c2033c7fd" (UID: "a8aebef0-3647-4ca2-a703-4d0c2033c7fd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.209054 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-config" (OuterVolumeSpecName: "config") pod "a8aebef0-3647-4ca2-a703-4d0c2033c7fd" (UID: "a8aebef0-3647-4ca2-a703-4d0c2033c7fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251252 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251298 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251311 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251352 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwdcm\" (UniqueName: \"kubernetes.io/projected/019c7f41-0990-4235-b29d-8d8e08d34af1-kube-api-access-fwdcm\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251362 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251371 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251383 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/019c7f41-0990-4235-b29d-8d8e08d34af1-config\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251391 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.251398 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgbmh\" (UniqueName: \"kubernetes.io/projected/a8aebef0-3647-4ca2-a703-4d0c2033c7fd-kube-api-access-hgbmh\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.293646 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-975f74cf9-g4lbb"]
Jan 25 05:54:16 crc kubenswrapper[4728]: E0125 05:54:16.294048 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d85a5b2-cb44-4190-973a-179ad187fd37" containerName="placement-db-sync"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307127 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d85a5b2-cb44-4190-973a-179ad187fd37" containerName="placement-db-sync"
Jan 25 05:54:16 crc kubenswrapper[4728]: E0125 05:54:16.307161 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019c7f41-0990-4235-b29d-8d8e08d34af1" containerName="neutron-db-sync"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307169 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="019c7f41-0990-4235-b29d-8d8e08d34af1" containerName="neutron-db-sync"
Jan 25 05:54:16 crc kubenswrapper[4728]: E0125 05:54:16.307187 4728 cpu_manager.go:410] "RemoveStaleState: removing
container" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerName="init"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307193 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerName="init"
Jan 25 05:54:16 crc kubenswrapper[4728]: E0125 05:54:16.307222 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerName="dnsmasq-dns"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307357 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerName="dnsmasq-dns"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307588 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" containerName="dnsmasq-dns"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307619 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d85a5b2-cb44-4190-973a-179ad187fd37" containerName="placement-db-sync"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.307637 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="019c7f41-0990-4235-b29d-8d8e08d34af1" containerName="neutron-db-sync"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.308532 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-975f74cf9-g4lbb"]
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.308635 4728 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.353436 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-sb\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.353485 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-swift-storage-0\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.353533 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h9tw\" (UniqueName: \"kubernetes.io/projected/d7c8b502-0768-4226-aae0-f6e9f639cb9a-kube-api-access-7h9tw\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.353566 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-nb\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.353638 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-config\") pod
\"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.353675 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-svc\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.448949 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f45b9f4bc-g92ls"]
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.455598 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-sb\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.455631 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-swift-storage-0\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.455666 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h9tw\" (UniqueName: \"kubernetes.io/projected/d7c8b502-0768-4226-aae0-f6e9f639cb9a-kube-api-access-7h9tw\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.455687 4728 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-nb\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.455750 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-config\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.455783 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-svc\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.456796 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-nb\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.457055 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-swift-storage-0\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.457275 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-svc\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.457685 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-sb\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.459614 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-config\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.459843 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f45b9f4bc-g92ls"]
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.472620 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h9tw\" (UniqueName: \"kubernetes.io/projected/d7c8b502-0768-4226-aae0-f6e9f639cb9a-kube-api-access-7h9tw\") pod \"dnsmasq-dns-975f74cf9-g4lbb\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") " pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.498569 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55586495d8-v5qdm"]
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.501256 4728 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.509927 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.509940 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.509991 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pf6m5"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.510115 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.512286 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55586495d8-v5qdm"]
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.558022 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-combined-ca-bundle\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.558073 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdz4n\" (UniqueName: \"kubernetes.io/projected/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-kube-api-access-gdz4n\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.558119 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-httpd-config\") pod
\"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.558304 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-ovndb-tls-certs\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.558385 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-config\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.628871 4728 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.674465 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-combined-ca-bundle\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.676092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdz4n\" (UniqueName: \"kubernetes.io/projected/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-kube-api-access-gdz4n\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.676140 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-httpd-config\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.676462 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-ovndb-tls-certs\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.676528 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-config\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc
kubenswrapper[4728]: I0125 05:54:16.683291 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-httpd-config\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.683601 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-config\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.697160 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-ovndb-tls-certs\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.697450 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-combined-ca-bundle\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.699895 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdz4n\" (UniqueName: \"kubernetes.io/projected/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-kube-api-access-gdz4n\") pod \"neutron-55586495d8-v5qdm\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.814538 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f74fbc68-hj87v"] Jan
25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.817234 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55586495d8-v5qdm"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.817248 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.821157 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.821386 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tl8s8"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.821424 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.821529 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.821545 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.823397 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f74fbc68-hj87v"]
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.881679 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-public-tls-certs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.881731 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-scripts\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.881789 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhj7\" (UniqueName: \"kubernetes.io/projected/e537ee66-7c17-4eb1-a0ce-262f4c260d16-kube-api-access-hkhj7\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.881855 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-combined-ca-bundle\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.882011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-config-data\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.882134 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-internal-tls-certs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.882196 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\"
(UniqueName: \"kubernetes.io/empty-dir/e537ee66-7c17-4eb1-a0ce-262f4c260d16-logs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983510 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-internal-tls-certs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983559 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e537ee66-7c17-4eb1-a0ce-262f4c260d16-logs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983607 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-public-tls-certs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983624 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-scripts\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983645 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhj7\" (UniqueName: \"kubernetes.io/projected/e537ee66-7c17-4eb1-a0ce-262f4c260d16-kube-api-access-hkhj7\") pod
\"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983663 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-combined-ca-bundle\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.983700 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-config-data\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.984855 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e537ee66-7c17-4eb1-a0ce-262f4c260d16-logs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.987637 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-config-data\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.988150 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-internal-tls-certs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v" Jan 25
05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.990730 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-scripts\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.990785 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-public-tls-certs\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:16 crc kubenswrapper[4728]: I0125 05:54:16.992432 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e537ee66-7c17-4eb1-a0ce-262f4c260d16-combined-ca-bundle\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:17 crc kubenswrapper[4728]: I0125 05:54:17.007783 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhj7\" (UniqueName: \"kubernetes.io/projected/e537ee66-7c17-4eb1-a0ce-262f4c260d16-kube-api-access-hkhj7\") pod \"placement-7f74fbc68-hj87v\" (UID: \"e537ee66-7c17-4eb1-a0ce-262f4c260d16\") " pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:17 crc kubenswrapper[4728]: I0125 05:54:17.130605 4728 generic.go:334] "Generic (PLEG): container finished" podID="bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" containerID="948d5d216e63f9a49474fb936dfb1138a6f5a8b909b42e01b8bab5c42584911a" exitCode=0
Jan 25 05:54:17 crc kubenswrapper[4728]: I0125 05:54:17.130651 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbl2s"
event={"ID":"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639","Type":"ContainerDied","Data":"948d5d216e63f9a49474fb936dfb1138a6f5a8b909b42e01b8bab5c42584911a"}
Jan 25 05:54:17 crc kubenswrapper[4728]: I0125 05:54:17.136685 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:17 crc kubenswrapper[4728]: I0125 05:54:17.343866 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8aebef0-3647-4ca2-a703-4d0c2033c7fd" path="/var/lib/kubelet/pods/a8aebef0-3647-4ca2-a703-4d0c2033c7fd/volumes"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.207670 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9455cf8b5-xnn9p"]
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.210019 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.210166 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9455cf8b5-xnn9p"]
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.224448 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.224766 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.311883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-ovndb-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.311967 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\"
(UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-internal-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.311993 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-combined-ca-bundle\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.312017 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-config\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.312042 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdwbx\" (UniqueName: \"kubernetes.io/projected/05936d12-72c1-4916-a564-2f4a886c2a0d-kube-api-access-gdwbx\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.312058 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-httpd-config\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.312077 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName:
\"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-public-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414064 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-internal-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414115 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-combined-ca-bundle\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414144 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-config\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414199 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdwbx\" (UniqueName: \"kubernetes.io/projected/05936d12-72c1-4916-a564-2f4a886c2a0d-kube-api-access-gdwbx\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414221 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-httpd-config\") pod 
\"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414241 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-public-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.414287 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-ovndb-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.422997 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-internal-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.425595 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-config\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.425736 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-public-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc 
kubenswrapper[4728]: I0125 05:54:18.430051 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-httpd-config\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.432429 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-combined-ca-bundle\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.434964 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-ovndb-tls-certs\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.436694 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdwbx\" (UniqueName: \"kubernetes.io/projected/05936d12-72c1-4916-a564-2f4a886c2a0d-kube-api-access-gdwbx\") pod \"neutron-9455cf8b5-xnn9p\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:18 crc kubenswrapper[4728]: I0125 05:54:18.537744 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.200094 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.200382 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.214088 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.240577 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.251191 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.329896 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-credential-keys\") pod \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.329956 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-config-data\") pod \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.329977 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2qfp\" (UniqueName: \"kubernetes.io/projected/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-kube-api-access-v2qfp\") pod 
\"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.330002 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-combined-ca-bundle\") pod \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.330132 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-scripts\") pod \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.330248 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-fernet-keys\") pod \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\" (UID: \"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639\") " Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.335604 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-kube-api-access-v2qfp" (OuterVolumeSpecName: "kube-api-access-v2qfp") pod "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" (UID: "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639"). InnerVolumeSpecName "kube-api-access-v2qfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.335690 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-scripts" (OuterVolumeSpecName: "scripts") pod "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" (UID: "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.335612 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" (UID: "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.337531 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" (UID: "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.362600 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-config-data" (OuterVolumeSpecName: "config-data") pod "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" (UID: "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.367709 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" (UID: "bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.435291 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.435340 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2qfp\" (UniqueName: \"kubernetes.io/projected/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-kube-api-access-v2qfp\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.435352 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.435361 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.435369 4728 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.435378 4728 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:19 crc kubenswrapper[4728]: I0125 05:54:19.735528 4728 scope.go:117] "RemoveContainer" containerID="e5ad7ccaa64be3ac12d1755bce860c7140dbc38b021130af937cc88b1d052130" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.180185 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-72kn4" 
event={"ID":"554cda4a-e73e-4f9c-93aa-23c41ef468a5","Type":"ContainerStarted","Data":"e1956bf2c5c63b58012e403c54c3d0afeeca580ba94af54e2770a7b7cb79ea2a"} Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.184824 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbl2s" event={"ID":"bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639","Type":"ContainerDied","Data":"a5be560b57040d6518b0b088086bd66c754c70a26ad191473ced932f6417e60c"} Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.184849 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5be560b57040d6518b0b088086bd66c754c70a26ad191473ced932f6417e60c" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.184860 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbl2s" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.186912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerStarted","Data":"7d68cf169dbb0066570e7f0b26179fa51d8c9197dfb09193001d709d71b208a0"} Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.190145 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.190451 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.206115 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-72kn4" podStartSLOduration=1.726182992 podStartE2EDuration="17.206100651s" podCreationTimestamp="2026-01-25 05:54:03 +0000 UTC" firstStartedPulling="2026-01-25 05:54:04.287194136 +0000 UTC m=+935.323072116" lastFinishedPulling="2026-01-25 05:54:19.767111785 +0000 UTC m=+950.802989775" 
observedRunningTime="2026-01-25 05:54:20.196125391 +0000 UTC m=+951.232003391" watchObservedRunningTime="2026-01-25 05:54:20.206100651 +0000 UTC m=+951.241978632" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.263758 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-975f74cf9-g4lbb"] Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.319141 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9455cf8b5-xnn9p"] Jan 25 05:54:20 crc kubenswrapper[4728]: E0125 05:54:20.328620 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice/crio-0c1887725114d5b3f8a5defd689e644c5b714345eec75604d6b887171078a72c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9aaed5_97ec_40d8_97dc_9783fd1c682f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice/crio-30c4284493f69b015ba9dae947c1d013ae108d50d6be2b23249fe5205d8a79b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf132cd80_c760_445f_b6bf_41d35700b35c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdd0ddf2_b2d4_4568_abc5_7fd76cbcb639.slice\": RecentStats: unable to find data in memory cache]" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.330315 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-64dbb5f568-n5f5j"] Jan 25 05:54:20 crc kubenswrapper[4728]: E0125 05:54:20.330977 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" containerName="keystone-bootstrap" Jan 25 05:54:20 
crc kubenswrapper[4728]: I0125 05:54:20.330999 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" containerName="keystone-bootstrap" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.331175 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" containerName="keystone-bootstrap" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.331951 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.336045 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lk66m" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.344114 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-64dbb5f568-n5f5j"] Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.349642 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.349797 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.349996 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.350027 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.350453 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.387374 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f74fbc68-hj87v"] Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.396918 4728 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-55586495d8-v5qdm"] Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.468754 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-fernet-keys\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.468980 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-scripts\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.469013 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-internal-tls-certs\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.469045 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-config-data\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.469069 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpks2\" (UniqueName: \"kubernetes.io/projected/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-kube-api-access-tpks2\") pod \"keystone-64dbb5f568-n5f5j\" (UID: 
\"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.469107 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-public-tls-certs\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.469132 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-combined-ca-bundle\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.469193 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-credential-keys\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571509 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-credential-keys\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571580 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-fernet-keys\") pod \"keystone-64dbb5f568-n5f5j\" (UID: 
\"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571609 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-scripts\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571635 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-internal-tls-certs\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571664 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-config-data\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpks2\" (UniqueName: \"kubernetes.io/projected/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-kube-api-access-tpks2\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571725 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-public-tls-certs\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 
25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.571745 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-combined-ca-bundle\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.575912 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-scripts\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.576045 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-config-data\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.576809 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-credential-keys\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.580831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-combined-ca-bundle\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.582786 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-fernet-keys\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.584755 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-public-tls-certs\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.584964 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-internal-tls-certs\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.588747 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpks2\" (UniqueName: \"kubernetes.io/projected/eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581-kube-api-access-tpks2\") pod \"keystone-64dbb5f568-n5f5j\" (UID: \"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581\") " pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:20 crc kubenswrapper[4728]: I0125 05:54:20.660492 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.114515 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-64dbb5f568-n5f5j"] Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.157834 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55586495d8-v5qdm"] Jan 25 05:54:21 crc kubenswrapper[4728]: W0125 05:54:21.179420 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca4ef36_bc3a_42aa_8ab1_6a6cfdbee581.slice/crio-1fb0927ebe2d5393b01f1407c4ef1cc1d3a680b8ac8ee8f50666a85bb71c9732 WatchSource:0}: Error finding container 1fb0927ebe2d5393b01f1407c4ef1cc1d3a680b8ac8ee8f50666a85bb71c9732: Status 404 returned error can't find the container with id 1fb0927ebe2d5393b01f1407c4ef1cc1d3a680b8ac8ee8f50666a85bb71c9732 Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.180287 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54985bc57c-7dmw7"] Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.189019 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.220308 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54985bc57c-7dmw7"] Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.226389 4728 generic.go:334] "Generic (PLEG): container finished" podID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerID="d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916" exitCode=0 Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.226496 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" event={"ID":"d7c8b502-0768-4226-aae0-f6e9f639cb9a","Type":"ContainerDied","Data":"d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.226536 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" event={"ID":"d7c8b502-0768-4226-aae0-f6e9f639cb9a","Type":"ContainerStarted","Data":"170a2a1df22a60660e997f28fa09a00dfa164234d9c81963bf289fa20eed733c"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.231007 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9455cf8b5-xnn9p" event={"ID":"05936d12-72c1-4916-a564-2f4a886c2a0d","Type":"ContainerStarted","Data":"45b6c38a6adb6ecce8de335ca3fff0c4f011cf28e4e5b7fab52ca0f28844ae5b"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.231056 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9455cf8b5-xnn9p" event={"ID":"05936d12-72c1-4916-a564-2f4a886c2a0d","Type":"ContainerStarted","Data":"4a228917f27cd5686220fc79ab81b314d5ae93e65112b53893ee747ae6b66c60"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.236184 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-64dbb5f568-n5f5j" 
event={"ID":"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581","Type":"ContainerStarted","Data":"1fb0927ebe2d5393b01f1407c4ef1cc1d3a680b8ac8ee8f50666a85bb71c9732"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.251305 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de462b89-925b-42f0-9590-a93b2081cc41","Type":"ContainerStarted","Data":"568fecdccb03cf4c054e2764aff883c0a4450be038f180f93ddca486d69c8768"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.274469 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f74fbc68-hj87v" event={"ID":"e537ee66-7c17-4eb1-a0ce-262f4c260d16","Type":"ContainerStarted","Data":"0a540f1b4d87c095b52e7394e80baaafd48e5c1afcac8968f983f9004cc95bbb"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.274521 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f74fbc68-hj87v" event={"ID":"e537ee66-7c17-4eb1-a0ce-262f4c260d16","Type":"ContainerStarted","Data":"b3a0fa6f25a77c6891bc26c594c96d638e0ba8b186b637cd2e4beaf9135e92f7"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.288248 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55586495d8-v5qdm" event={"ID":"16759a9d-9f02-4e00-b4e4-25ab295d6ffb","Type":"ContainerStarted","Data":"b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.288288 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55586495d8-v5qdm" event={"ID":"16759a9d-9f02-4e00-b4e4-25ab295d6ffb","Type":"ContainerStarted","Data":"5b3c14eb8b48ec8305fa632d99b594a356be56d374cf2814600a10f57f5525e1"} Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.311605 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.311584435 podStartE2EDuration="10.311584435s" podCreationTimestamp="2026-01-25 
05:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:21.280869548 +0000 UTC m=+952.316747528" watchObservedRunningTime="2026-01-25 05:54:21.311584435 +0000 UTC m=+952.347462415" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.315447 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vvn\" (UniqueName: \"kubernetes.io/projected/0d93e327-c397-427e-abe4-0065144bcb7a-kube-api-access-t9vvn\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.315597 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-ovndb-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.315641 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-config\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.315723 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-internal-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.315858 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-combined-ca-bundle\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.315883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-public-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.316028 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-httpd-config\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437170 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vvn\" (UniqueName: \"kubernetes.io/projected/0d93e327-c397-427e-abe4-0065144bcb7a-kube-api-access-t9vvn\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437225 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-ovndb-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437261 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-config\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437358 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-internal-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437415 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-combined-ca-bundle\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437435 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-public-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.437462 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-httpd-config\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.448029 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-ovndb-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.448795 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-httpd-config\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.452929 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-internal-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.453438 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-config\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.458972 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-public-tls-certs\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.462865 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d93e327-c397-427e-abe4-0065144bcb7a-combined-ca-bundle\") pod \"neutron-54985bc57c-7dmw7\" (UID: 
\"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.467526 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vvn\" (UniqueName: \"kubernetes.io/projected/0d93e327-c397-427e-abe4-0065144bcb7a-kube-api-access-t9vvn\") pod \"neutron-54985bc57c-7dmw7\" (UID: \"0d93e327-c397-427e-abe4-0065144bcb7a\") " pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.519681 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.943800 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 25 05:54:21 crc kubenswrapper[4728]: I0125 05:54:21.944436 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.116472 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54985bc57c-7dmw7"] Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.292070 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.292112 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.296805 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f74fbc68-hj87v" event={"ID":"e537ee66-7c17-4eb1-a0ce-262f4c260d16","Type":"ContainerStarted","Data":"1bd1f09b03329e08265f108fddef506020b1b5f89b345bbaf26147cae67ad0f2"} Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.297716 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-7f74fbc68-hj87v" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.297743 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f74fbc68-hj87v" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.305854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55586495d8-v5qdm" event={"ID":"16759a9d-9f02-4e00-b4e4-25ab295d6ffb","Type":"ContainerStarted","Data":"0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5"} Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.305907 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55586495d8-v5qdm" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.305903 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55586495d8-v5qdm" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-api" containerID="cri-o://b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358" gracePeriod=30 Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.305916 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55586495d8-v5qdm" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-httpd" containerID="cri-o://0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5" gracePeriod=30 Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.308561 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" event={"ID":"d7c8b502-0768-4226-aae0-f6e9f639cb9a","Type":"ContainerStarted","Data":"a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6"} Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.309240 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.320422 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f74fbc68-hj87v" podStartSLOduration=6.320410959 podStartE2EDuration="6.320410959s" podCreationTimestamp="2026-01-25 05:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:22.311366694 +0000 UTC m=+953.347244675" watchObservedRunningTime="2026-01-25 05:54:22.320410959 +0000 UTC m=+953.356288929" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.322179 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9455cf8b5-xnn9p" event={"ID":"05936d12-72c1-4916-a564-2f4a886c2a0d","Type":"ContainerStarted","Data":"0073b56744b284a02715ef6b0aa4ddac0bbf6877826227bf5355e772f97ecb13"} Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.322701 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.330496 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-64dbb5f568-n5f5j" event={"ID":"eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581","Type":"ContainerStarted","Data":"f35ebde9ca681a7569b052fec48ddf3c475b6e32d6528f7f069050e1d4007f76"} Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.330900 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.334043 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" podStartSLOduration=6.334033516 podStartE2EDuration="6.334033516s" podCreationTimestamp="2026-01-25 05:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:22.328656077 +0000 UTC m=+953.364534056" watchObservedRunningTime="2026-01-25 
05:54:22.334033516 +0000 UTC m=+953.369911496" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.338799 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.339890 4728 generic.go:334] "Generic (PLEG): container finished" podID="554cda4a-e73e-4f9c-93aa-23c41ef468a5" containerID="e1956bf2c5c63b58012e403c54c3d0afeeca580ba94af54e2770a7b7cb79ea2a" exitCode=0 Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.340025 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-72kn4" event={"ID":"554cda4a-e73e-4f9c-93aa-23c41ef468a5","Type":"ContainerDied","Data":"e1956bf2c5c63b58012e403c54c3d0afeeca580ba94af54e2770a7b7cb79ea2a"} Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.340440 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.349189 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55586495d8-v5qdm" podStartSLOduration=6.349181399 podStartE2EDuration="6.349181399s" podCreationTimestamp="2026-01-25 05:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:22.344441923 +0000 UTC m=+953.380319902" watchObservedRunningTime="2026-01-25 05:54:22.349181399 +0000 UTC m=+953.385059370" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.354815 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.377950 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9455cf8b5-xnn9p" podStartSLOduration=4.37793105 podStartE2EDuration="4.37793105s" podCreationTimestamp="2026-01-25 05:54:18 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:22.377303667 +0000 UTC m=+953.413181647" watchObservedRunningTime="2026-01-25 05:54:22.37793105 +0000 UTC m=+953.413809030" Jan 25 05:54:22 crc kubenswrapper[4728]: I0125 05:54:22.418007 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-64dbb5f568-n5f5j" podStartSLOduration=2.417992742 podStartE2EDuration="2.417992742s" podCreationTimestamp="2026-01-25 05:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:22.411200313 +0000 UTC m=+953.447078293" watchObservedRunningTime="2026-01-25 05:54:22.417992742 +0000 UTC m=+953.453870722" Jan 25 05:54:23 crc kubenswrapper[4728]: I0125 05:54:23.351199 4728 generic.go:334] "Generic (PLEG): container finished" podID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerID="0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5" exitCode=0 Jan 25 05:54:23 crc kubenswrapper[4728]: I0125 05:54:23.351297 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55586495d8-v5qdm" event={"ID":"16759a9d-9f02-4e00-b4e4-25ab295d6ffb","Type":"ContainerDied","Data":"0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5"} Jan 25 05:54:23 crc kubenswrapper[4728]: I0125 05:54:23.353143 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54985bc57c-7dmw7" event={"ID":"0d93e327-c397-427e-abe4-0065144bcb7a","Type":"ContainerStarted","Data":"9cc4c6c12a9b94e6727f2d63a3e47e92148372dd3c0d51721dcae2fe0a94dc53"} Jan 25 05:54:23 crc kubenswrapper[4728]: I0125 05:54:23.353541 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:23 crc kubenswrapper[4728]: I0125 05:54:23.795299 4728 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:26 crc kubenswrapper[4728]: I0125 05:54:26.630571 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" Jan 25 05:54:26 crc kubenswrapper[4728]: I0125 05:54:26.699919 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8555c65755-z9ttf"] Jan 25 05:54:26 crc kubenswrapper[4728]: I0125 05:54:26.701620 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="dnsmasq-dns" containerID="cri-o://05bd7b472dbb0bc6510904e0b5f17ed1c408264ec0244c3f3aab2db674107e61" gracePeriod=10 Jan 25 05:54:26 crc kubenswrapper[4728]: I0125 05:54:26.898997 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 25 05:54:27 crc kubenswrapper[4728]: I0125 05:54:27.390567 4728 generic.go:334] "Generic (PLEG): container finished" podID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerID="05bd7b472dbb0bc6510904e0b5f17ed1c408264ec0244c3f3aab2db674107e61" exitCode=0 Jan 25 05:54:27 crc kubenswrapper[4728]: I0125 05:54:27.390628 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" event={"ID":"44d23e17-65a8-4719-a9ca-a69f392fdbf3","Type":"ContainerDied","Data":"05bd7b472dbb0bc6510904e0b5f17ed1c408264ec0244c3f3aab2db674107e61"} Jan 25 05:54:28 crc kubenswrapper[4728]: I0125 05:54:28.682992 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Jan 25 05:54:29 crc kubenswrapper[4728]: E0125 05:54:29.376687 4728 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: 
could not stat "/var/lib/containers/storage/overlay/0e3dcf3cbfa5bdd4b2e49f5e7e822683d073c50ca486b655bd62c81f3b25b899/diff" to get inode usage: stat /var/lib/containers/storage/overlay/0e3dcf3cbfa5bdd4b2e49f5e7e822683d073c50ca486b655bd62c81f3b25b899/diff: no such file or directory, extraDiskErr: Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.033266 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-72kn4" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.133199 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-db-sync-config-data\") pod \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.133696 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-combined-ca-bundle\") pod \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.133888 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzxl\" (UniqueName: \"kubernetes.io/projected/554cda4a-e73e-4f9c-93aa-23c41ef468a5-kube-api-access-grzxl\") pod \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\" (UID: \"554cda4a-e73e-4f9c-93aa-23c41ef468a5\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.137444 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "554cda4a-e73e-4f9c-93aa-23c41ef468a5" (UID: "554cda4a-e73e-4f9c-93aa-23c41ef468a5"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.139411 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554cda4a-e73e-4f9c-93aa-23c41ef468a5-kube-api-access-grzxl" (OuterVolumeSpecName: "kube-api-access-grzxl") pod "554cda4a-e73e-4f9c-93aa-23c41ef468a5" (UID: "554cda4a-e73e-4f9c-93aa-23c41ef468a5"). InnerVolumeSpecName "kube-api-access-grzxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.167298 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "554cda4a-e73e-4f9c-93aa-23c41ef468a5" (UID: "554cda4a-e73e-4f9c-93aa-23c41ef468a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.236728 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.236759 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzxl\" (UniqueName: \"kubernetes.io/projected/554cda4a-e73e-4f9c-93aa-23c41ef468a5-kube-api-access-grzxl\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.236773 4728 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/554cda4a-e73e-4f9c-93aa-23c41ef468a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.249128 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.337858 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-sb\") pod \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.337905 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-config\") pod \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.337943 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-nb\") pod \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.338729 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-svc\") pod \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.338761 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-swift-storage-0\") pod \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.338833 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tclv5\" 
(UniqueName: \"kubernetes.io/projected/44d23e17-65a8-4719-a9ca-a69f392fdbf3-kube-api-access-tclv5\") pod \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\" (UID: \"44d23e17-65a8-4719-a9ca-a69f392fdbf3\") " Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.342858 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d23e17-65a8-4719-a9ca-a69f392fdbf3-kube-api-access-tclv5" (OuterVolumeSpecName: "kube-api-access-tclv5") pod "44d23e17-65a8-4719-a9ca-a69f392fdbf3" (UID: "44d23e17-65a8-4719-a9ca-a69f392fdbf3"). InnerVolumeSpecName "kube-api-access-tclv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.369275 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44d23e17-65a8-4719-a9ca-a69f392fdbf3" (UID: "44d23e17-65a8-4719-a9ca-a69f392fdbf3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.374237 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-config" (OuterVolumeSpecName: "config") pod "44d23e17-65a8-4719-a9ca-a69f392fdbf3" (UID: "44d23e17-65a8-4719-a9ca-a69f392fdbf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.378270 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "44d23e17-65a8-4719-a9ca-a69f392fdbf3" (UID: "44d23e17-65a8-4719-a9ca-a69f392fdbf3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.385436 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44d23e17-65a8-4719-a9ca-a69f392fdbf3" (UID: "44d23e17-65a8-4719-a9ca-a69f392fdbf3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.385583 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44d23e17-65a8-4719-a9ca-a69f392fdbf3" (UID: "44d23e17-65a8-4719-a9ca-a69f392fdbf3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.433857 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-72kn4" event={"ID":"554cda4a-e73e-4f9c-93aa-23c41ef468a5","Type":"ContainerDied","Data":"c713be16219343ff728926c82a077f873b9006ecd13b18683f9868fda2231efe"} Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.434105 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c713be16219343ff728926c82a077f873b9006ecd13b18683f9868fda2231efe" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.433876 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-72kn4" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.435633 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerStarted","Data":"bb02b25ccc758d25fe4d3563f6da4e8d2e86e6d20c03953fa5e70c7b5ca0ae42"} Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.438593 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54985bc57c-7dmw7" event={"ID":"0d93e327-c397-427e-abe4-0065144bcb7a","Type":"ContainerStarted","Data":"7fbdc739f2f5d60a4b0846bb40101526376ddc9f25e360b880a6a427d05e68c6"} Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.438634 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54985bc57c-7dmw7" event={"ID":"0d93e327-c397-427e-abe4-0065144bcb7a","Type":"ContainerStarted","Data":"c947f2b71240c85e80f5f942ebbf990cb2aa45c9519013e2f346628bf9a4ea04"} Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.439088 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.440687 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.440730 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.440741 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tclv5\" (UniqueName: \"kubernetes.io/projected/44d23e17-65a8-4719-a9ca-a69f392fdbf3-kube-api-access-tclv5\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc 
kubenswrapper[4728]: I0125 05:54:31.440751 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.440761 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.440769 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d23e17-65a8-4719-a9ca-a69f392fdbf3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.443366 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" event={"ID":"44d23e17-65a8-4719-a9ca-a69f392fdbf3","Type":"ContainerDied","Data":"2bc82c19f5edb4b92db97c268f72026b3b6633dc886b7e3b45f68135bfcd0129"} Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.443409 4728 scope.go:117] "RemoveContainer" containerID="05bd7b472dbb0bc6510904e0b5f17ed1c408264ec0244c3f3aab2db674107e61" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.443415 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8555c65755-z9ttf" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.471422 4728 scope.go:117] "RemoveContainer" containerID="0ae8b94e04250731499178bf826d25612cf99df7a25abe2dde637680e4f7b3c3" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.475919 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54985bc57c-7dmw7" podStartSLOduration=10.475907065 podStartE2EDuration="10.475907065s" podCreationTimestamp="2026-01-25 05:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:31.452072802 +0000 UTC m=+962.487950782" watchObservedRunningTime="2026-01-25 05:54:31.475907065 +0000 UTC m=+962.511785045" Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.483024 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8555c65755-z9ttf"] Jan 25 05:54:31 crc kubenswrapper[4728]: I0125 05:54:31.487537 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8555c65755-z9ttf"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.224878 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b587b754-6vhmj"] Jan 25 05:54:32 crc kubenswrapper[4728]: E0125 05:54:32.225443 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="init" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.225457 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="init" Jan 25 05:54:32 crc kubenswrapper[4728]: E0125 05:54:32.225475 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="554cda4a-e73e-4f9c-93aa-23c41ef468a5" containerName="barbican-db-sync" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.225483 4728 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="554cda4a-e73e-4f9c-93aa-23c41ef468a5" containerName="barbican-db-sync" Jan 25 05:54:32 crc kubenswrapper[4728]: E0125 05:54:32.225493 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="dnsmasq-dns" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.225500 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="dnsmasq-dns" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.225658 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" containerName="dnsmasq-dns" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.225687 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="554cda4a-e73e-4f9c-93aa-23c41ef468a5" containerName="barbican-db-sync" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.253197 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b587b754-6vhmj"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.253285 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.262166 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ngjgt" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.262393 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-755865797c-j8rm6"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.262491 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.262538 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.263777 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.277999 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b49f9f9b7-2ld95"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.279335 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.281924 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b49f9f9b7-2ld95"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.288389 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.298063 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-755865797c-j8rm6"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.365911 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-config-data\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.365961 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-swift-storage-0\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.365995 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-config-data-custom\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366018 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd2e0efe-0434-4103-a08d-d014f69addf6-logs\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366040 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-config\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366063 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-sb\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366080 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69tdf\" (UniqueName: \"kubernetes.io/projected/2d6794a2-d312-4727-9196-d030dea17b67-kube-api-access-69tdf\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366103 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-nb\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366135 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-combined-ca-bundle\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366169 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-svc\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366207 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj2bn\" (UniqueName: \"kubernetes.io/projected/fd2e0efe-0434-4103-a08d-d014f69addf6-kube-api-access-tj2bn\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366228 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktr8\" (UniqueName: \"kubernetes.io/projected/4564c893-fa22-47c0-92b9-4d503b3553ee-kube-api-access-dktr8\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366276 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-config-data\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " 
pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366290 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-combined-ca-bundle\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366310 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-config-data-custom\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.366367 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4564c893-fa22-47c0-92b9-4d503b3553ee-logs\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.374351 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5479665bfd-x264b"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.375913 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.378746 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.383431 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5479665bfd-x264b"] Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.462536 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-csg7p" event={"ID":"74d5365c-76ac-4544-b1e2-ae442ee191dd","Type":"ContainerStarted","Data":"71c95086baeef57765b0e9204bbd38d6bb79ab580e92ae66a96331630c9baeb2"} Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467713 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data-custom\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467758 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-config-data\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467790 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-swift-storage-0\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467816 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6afcd9c-55bf-4a07-b8bf-311d452bc324-logs\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467838 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-config-data-custom\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467859 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd2e0efe-0434-4103-a08d-d014f69addf6-logs\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467878 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-config\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467908 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-sb\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467924 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-69tdf\" (UniqueName: \"kubernetes.io/projected/2d6794a2-d312-4727-9196-d030dea17b67-kube-api-access-69tdf\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467942 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-nb\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467967 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-combined-ca-bundle\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.467989 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-svc\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468024 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-combined-ca-bundle\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468046 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tj2bn\" (UniqueName: \"kubernetes.io/projected/fd2e0efe-0434-4103-a08d-d014f69addf6-kube-api-access-tj2bn\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468064 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468083 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktr8\" (UniqueName: \"kubernetes.io/projected/4564c893-fa22-47c0-92b9-4d503b3553ee-kube-api-access-dktr8\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468121 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-config-data\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468137 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-combined-ca-bundle\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc 
kubenswrapper[4728]: I0125 05:54:32.468158 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-config-data-custom\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468181 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4564c893-fa22-47c0-92b9-4d503b3553ee-logs\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.468207 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42x2k\" (UniqueName: \"kubernetes.io/projected/d6afcd9c-55bf-4a07-b8bf-311d452bc324-kube-api-access-42x2k\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.471963 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd2e0efe-0434-4103-a08d-d014f69addf6-logs\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.472243 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-sb\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 
05:54:32.472781 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-svc\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.473117 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-swift-storage-0\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.475723 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4564c893-fa22-47c0-92b9-4d503b3553ee-logs\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.475905 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-config\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.476699 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-nb\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.478286 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-db-sync-csg7p" podStartSLOduration=2.551130512 podStartE2EDuration="29.478264812s" podCreationTimestamp="2026-01-25 05:54:03 +0000 UTC" firstStartedPulling="2026-01-25 05:54:04.127070474 +0000 UTC m=+935.162948454" lastFinishedPulling="2026-01-25 05:54:31.054204774 +0000 UTC m=+962.090082754" observedRunningTime="2026-01-25 05:54:32.474241595 +0000 UTC m=+963.510119575" watchObservedRunningTime="2026-01-25 05:54:32.478264812 +0000 UTC m=+963.514142792" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.485407 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-config-data\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.485886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-config-data-custom\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.486177 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-combined-ca-bundle\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.486308 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-config-data-custom\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: 
\"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.487069 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd2e0efe-0434-4103-a08d-d014f69addf6-config-data\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.492696 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj2bn\" (UniqueName: \"kubernetes.io/projected/fd2e0efe-0434-4103-a08d-d014f69addf6-kube-api-access-tj2bn\") pod \"barbican-worker-7b49f9f9b7-2ld95\" (UID: \"fd2e0efe-0434-4103-a08d-d014f69addf6\") " pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.496416 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktr8\" (UniqueName: \"kubernetes.io/projected/4564c893-fa22-47c0-92b9-4d503b3553ee-kube-api-access-dktr8\") pod \"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.501935 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69tdf\" (UniqueName: \"kubernetes.io/projected/2d6794a2-d312-4727-9196-d030dea17b67-kube-api-access-69tdf\") pod \"dnsmasq-dns-755865797c-j8rm6\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.502830 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4564c893-fa22-47c0-92b9-4d503b3553ee-combined-ca-bundle\") pod 
\"barbican-keystone-listener-5b587b754-6vhmj\" (UID: \"4564c893-fa22-47c0-92b9-4d503b3553ee\") " pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.570674 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-combined-ca-bundle\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.570742 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.570839 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42x2k\" (UniqueName: \"kubernetes.io/projected/d6afcd9c-55bf-4a07-b8bf-311d452bc324-kube-api-access-42x2k\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.570883 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data-custom\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.570940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6afcd9c-55bf-4a07-b8bf-311d452bc324-logs\") pod 
\"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.572130 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6afcd9c-55bf-4a07-b8bf-311d452bc324-logs\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.578804 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data-custom\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.579968 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-combined-ca-bundle\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.580545 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.590497 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.595183 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42x2k\" (UniqueName: \"kubernetes.io/projected/d6afcd9c-55bf-4a07-b8bf-311d452bc324-kube-api-access-42x2k\") pod \"barbican-api-5479665bfd-x264b\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.608453 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.616706 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b49f9f9b7-2ld95" Jan 25 05:54:32 crc kubenswrapper[4728]: I0125 05:54:32.727301 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.044612 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-755865797c-j8rm6"] Jan 25 05:54:33 crc kubenswrapper[4728]: W0125 05:54:33.063969 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d6794a2_d312_4727_9196_d030dea17b67.slice/crio-62ffc3be0a56572de6a691ff14f8b25adc82dee9035417d7a11bb1a639a48a7c WatchSource:0}: Error finding container 62ffc3be0a56572de6a691ff14f8b25adc82dee9035417d7a11bb1a639a48a7c: Status 404 returned error can't find the container with id 62ffc3be0a56572de6a691ff14f8b25adc82dee9035417d7a11bb1a639a48a7c Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.172436 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b49f9f9b7-2ld95"] Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.179011 4728 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b587b754-6vhmj"] Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.316976 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5479665bfd-x264b"] Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.339357 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d23e17-65a8-4719-a9ca-a69f392fdbf3" path="/var/lib/kubelet/pods/44d23e17-65a8-4719-a9ca-a69f392fdbf3/volumes" Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.474363 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5479665bfd-x264b" event={"ID":"d6afcd9c-55bf-4a07-b8bf-311d452bc324","Type":"ContainerStarted","Data":"0295631be9b1a80b65bb9b7630b9c9394f86268e32b1076acb2b2f894161e66e"} Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.478069 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b49f9f9b7-2ld95" event={"ID":"fd2e0efe-0434-4103-a08d-d014f69addf6","Type":"ContainerStarted","Data":"7212cadd707efac44d789cb06b7bc931022a245985f32a626609f9783f818222"} Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.480008 4728 generic.go:334] "Generic (PLEG): container finished" podID="2d6794a2-d312-4727-9196-d030dea17b67" containerID="9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34" exitCode=0 Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.480050 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755865797c-j8rm6" event={"ID":"2d6794a2-d312-4727-9196-d030dea17b67","Type":"ContainerDied","Data":"9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34"} Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.480089 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755865797c-j8rm6" 
event={"ID":"2d6794a2-d312-4727-9196-d030dea17b67","Type":"ContainerStarted","Data":"62ffc3be0a56572de6a691ff14f8b25adc82dee9035417d7a11bb1a639a48a7c"} Jan 25 05:54:33 crc kubenswrapper[4728]: I0125 05:54:33.482254 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" event={"ID":"4564c893-fa22-47c0-92b9-4d503b3553ee","Type":"ContainerStarted","Data":"38ae544f15da77c0f50cd794cfb4490a4831ed208607f4ccd21f309a816180b9"} Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.499155 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5479665bfd-x264b" event={"ID":"d6afcd9c-55bf-4a07-b8bf-311d452bc324","Type":"ContainerStarted","Data":"02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65"} Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.499235 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5479665bfd-x264b" event={"ID":"d6afcd9c-55bf-4a07-b8bf-311d452bc324","Type":"ContainerStarted","Data":"f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa"} Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.499395 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.499430 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.504817 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755865797c-j8rm6" event={"ID":"2d6794a2-d312-4727-9196-d030dea17b67","Type":"ContainerStarted","Data":"d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2"} Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.505212 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:34 crc 
kubenswrapper[4728]: I0125 05:54:34.524491 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5479665bfd-x264b" podStartSLOduration=2.5244788959999998 podStartE2EDuration="2.524478896s" podCreationTimestamp="2026-01-25 05:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:34.513760295 +0000 UTC m=+965.549638275" watchObservedRunningTime="2026-01-25 05:54:34.524478896 +0000 UTC m=+965.560356875" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.537709 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-755865797c-j8rm6" podStartSLOduration=2.537685599 podStartE2EDuration="2.537685599s" podCreationTimestamp="2026-01-25 05:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:34.531994438 +0000 UTC m=+965.567872418" watchObservedRunningTime="2026-01-25 05:54:34.537685599 +0000 UTC m=+965.573563578" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.698461 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b8466696b-psmv9"] Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.700108 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.702220 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.702388 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.707853 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b8466696b-psmv9"] Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825017 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36477670-fd4c-4015-8fab-b7608c72a906-logs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825068 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-config-data\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825104 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-combined-ca-bundle\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825188 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-public-tls-certs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825211 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2nq\" (UniqueName: \"kubernetes.io/projected/36477670-fd4c-4015-8fab-b7608c72a906-kube-api-access-lv2nq\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825227 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-config-data-custom\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.825242 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-internal-tls-certs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927158 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-public-tls-certs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927202 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-lv2nq\" (UniqueName: \"kubernetes.io/projected/36477670-fd4c-4015-8fab-b7608c72a906-kube-api-access-lv2nq\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927221 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-config-data-custom\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927239 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-internal-tls-certs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36477670-fd4c-4015-8fab-b7608c72a906-logs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927341 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-config-data\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.927373 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-combined-ca-bundle\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.931100 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36477670-fd4c-4015-8fab-b7608c72a906-logs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.933034 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-combined-ca-bundle\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.933584 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-public-tls-certs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.934733 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-internal-tls-certs\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.935701 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-config-data\") pod 
\"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.943224 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36477670-fd4c-4015-8fab-b7608c72a906-config-data-custom\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:34 crc kubenswrapper[4728]: I0125 05:54:34.943643 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2nq\" (UniqueName: \"kubernetes.io/projected/36477670-fd4c-4015-8fab-b7608c72a906-kube-api-access-lv2nq\") pod \"barbican-api-6b8466696b-psmv9\" (UID: \"36477670-fd4c-4015-8fab-b7608c72a906\") " pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:35 crc kubenswrapper[4728]: I0125 05:54:35.017843 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:35 crc kubenswrapper[4728]: I0125 05:54:35.415520 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b8466696b-psmv9"] Jan 25 05:54:35 crc kubenswrapper[4728]: I0125 05:54:35.514090 4728 generic.go:334] "Generic (PLEG): container finished" podID="74d5365c-76ac-4544-b1e2-ae442ee191dd" containerID="71c95086baeef57765b0e9204bbd38d6bb79ab580e92ae66a96331630c9baeb2" exitCode=0 Jan 25 05:54:35 crc kubenswrapper[4728]: I0125 05:54:35.514794 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-csg7p" event={"ID":"74d5365c-76ac-4544-b1e2-ae442ee191dd","Type":"ContainerDied","Data":"71c95086baeef57765b0e9204bbd38d6bb79ab580e92ae66a96331630c9baeb2"} Jan 25 05:54:37 crc kubenswrapper[4728]: W0125 05:54:37.191640 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36477670_fd4c_4015_8fab_b7608c72a906.slice/crio-88b0034122d093f801ee302dae634d48612d058227471dbdc16033910fad1ae6 WatchSource:0}: Error finding container 88b0034122d093f801ee302dae634d48612d058227471dbdc16033910fad1ae6: Status 404 returned error can't find the container with id 88b0034122d093f801ee302dae634d48612d058227471dbdc16033910fad1ae6 Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.242031 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-csg7p" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369342 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-db-sync-config-data\") pod \"74d5365c-76ac-4544-b1e2-ae442ee191dd\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369427 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-scripts\") pod \"74d5365c-76ac-4544-b1e2-ae442ee191dd\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369480 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-config-data\") pod \"74d5365c-76ac-4544-b1e2-ae442ee191dd\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369528 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74d5365c-76ac-4544-b1e2-ae442ee191dd-etc-machine-id\") pod \"74d5365c-76ac-4544-b1e2-ae442ee191dd\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369561 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-combined-ca-bundle\") pod \"74d5365c-76ac-4544-b1e2-ae442ee191dd\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369611 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w5p8\" 
(UniqueName: \"kubernetes.io/projected/74d5365c-76ac-4544-b1e2-ae442ee191dd-kube-api-access-6w5p8\") pod \"74d5365c-76ac-4544-b1e2-ae442ee191dd\" (UID: \"74d5365c-76ac-4544-b1e2-ae442ee191dd\") " Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.369817 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d5365c-76ac-4544-b1e2-ae442ee191dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "74d5365c-76ac-4544-b1e2-ae442ee191dd" (UID: "74d5365c-76ac-4544-b1e2-ae442ee191dd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.370094 4728 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74d5365c-76ac-4544-b1e2-ae442ee191dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.376622 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "74d5365c-76ac-4544-b1e2-ae442ee191dd" (UID: "74d5365c-76ac-4544-b1e2-ae442ee191dd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.379994 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-scripts" (OuterVolumeSpecName: "scripts") pod "74d5365c-76ac-4544-b1e2-ae442ee191dd" (UID: "74d5365c-76ac-4544-b1e2-ae442ee191dd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.380186 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d5365c-76ac-4544-b1e2-ae442ee191dd-kube-api-access-6w5p8" (OuterVolumeSpecName: "kube-api-access-6w5p8") pod "74d5365c-76ac-4544-b1e2-ae442ee191dd" (UID: "74d5365c-76ac-4544-b1e2-ae442ee191dd"). InnerVolumeSpecName "kube-api-access-6w5p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.404823 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74d5365c-76ac-4544-b1e2-ae442ee191dd" (UID: "74d5365c-76ac-4544-b1e2-ae442ee191dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.415779 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-config-data" (OuterVolumeSpecName: "config-data") pod "74d5365c-76ac-4544-b1e2-ae442ee191dd" (UID: "74d5365c-76ac-4544-b1e2-ae442ee191dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.471675 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.471705 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.471719 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w5p8\" (UniqueName: \"kubernetes.io/projected/74d5365c-76ac-4544-b1e2-ae442ee191dd-kube-api-access-6w5p8\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.471729 4728 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.471737 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74d5365c-76ac-4544-b1e2-ae442ee191dd-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.528865 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b8466696b-psmv9" event={"ID":"36477670-fd4c-4015-8fab-b7608c72a906","Type":"ContainerStarted","Data":"88b0034122d093f801ee302dae634d48612d058227471dbdc16033910fad1ae6"} Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.530591 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-csg7p" 
event={"ID":"74d5365c-76ac-4544-b1e2-ae442ee191dd","Type":"ContainerDied","Data":"3fcd1b578ab4fe91fdc5e10396004dc08ababf04ff2de627fbef6aaa1902e2aa"} Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.530618 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fcd1b578ab4fe91fdc5e10396004dc08ababf04ff2de627fbef6aaa1902e2aa" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.530667 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-csg7p" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.732404 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:37 crc kubenswrapper[4728]: E0125 05:54:37.732767 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d5365c-76ac-4544-b1e2-ae442ee191dd" containerName="cinder-db-sync" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.732780 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d5365c-76ac-4544-b1e2-ae442ee191dd" containerName="cinder-db-sync" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.732949 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d5365c-76ac-4544-b1e2-ae442ee191dd" containerName="cinder-db-sync" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.733760 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.743886 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.744362 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.744411 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.744662 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jhblj" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.750168 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.775801 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.775846 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-scripts\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.775876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " 
pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.775898 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.775927 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxz8z\" (UniqueName: \"kubernetes.io/projected/e22486ac-1aae-4d93-9c43-8262f94761ff-kube-api-access-kxz8z\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.775962 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e22486ac-1aae-4d93-9c43-8262f94761ff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.794721 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-755865797c-j8rm6"] Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.795200 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-755865797c-j8rm6" podUID="2d6794a2-d312-4727-9196-d030dea17b67" containerName="dnsmasq-dns" containerID="cri-o://d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2" gracePeriod=10 Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.832633 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb49f8487-rfk6x"] Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.836713 4728 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.855590 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb49f8487-rfk6x"] Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878196 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2rw\" (UniqueName: \"kubernetes.io/projected/c05bf129-98d0-4259-b8f8-f86b1e68b084-kube-api-access-rf2rw\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878260 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878307 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878341 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-scripts\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878363 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878388 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878430 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-config\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878458 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxz8z\" (UniqueName: \"kubernetes.io/projected/e22486ac-1aae-4d93-9c43-8262f94761ff-kube-api-access-kxz8z\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878498 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878549 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e22486ac-1aae-4d93-9c43-8262f94761ff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878629 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-svc\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.878704 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.882937 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e22486ac-1aae-4d93-9c43-8262f94761ff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.885269 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.885754 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.891467 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.895722 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-scripts\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.898807 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxz8z\" (UniqueName: \"kubernetes.io/projected/e22486ac-1aae-4d93-9c43-8262f94761ff-kube-api-access-kxz8z\") pod \"cinder-scheduler-0\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.955012 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.957124 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.959099 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.989881 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.992929 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-config\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.993016 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.993112 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-svc\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.993184 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.993281 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rf2rw\" (UniqueName: \"kubernetes.io/projected/c05bf129-98d0-4259-b8f8-f86b1e68b084-kube-api-access-rf2rw\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.994126 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.994168 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-config\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.994708 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.995561 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-svc\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.996215 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:37 crc kubenswrapper[4728]: I0125 05:54:37.998989 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.009652 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2rw\" (UniqueName: \"kubernetes.io/projected/c05bf129-98d0-4259-b8f8-f86b1e68b084-kube-api-access-rf2rw\") pod \"dnsmasq-dns-6cb49f8487-rfk6x\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.070097 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096676 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-scripts\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096775 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096802 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc41a951-4159-45ba-a6d3-99b4591f2210-logs\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096821 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096839 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zq9x\" (UniqueName: \"kubernetes.io/projected/cc41a951-4159-45ba-a6d3-99b4591f2210-kube-api-access-9zq9x\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096888 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc41a951-4159-45ba-a6d3-99b4591f2210-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.096941 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data-custom\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.166928 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.198891 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data-custom\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.198985 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-scripts\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.199217 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.199232 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc41a951-4159-45ba-a6d3-99b4591f2210-logs\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.199249 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.199270 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zq9x\" (UniqueName: \"kubernetes.io/projected/cc41a951-4159-45ba-a6d3-99b4591f2210-kube-api-access-9zq9x\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.199310 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc41a951-4159-45ba-a6d3-99b4591f2210-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.199413 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc41a951-4159-45ba-a6d3-99b4591f2210-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.200847 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc41a951-4159-45ba-a6d3-99b4591f2210-logs\") pod \"cinder-api-0\" (UID: 
\"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.205163 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.217824 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data-custom\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.222602 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.228991 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-scripts\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.234173 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zq9x\" (UniqueName: \"kubernetes.io/projected/cc41a951-4159-45ba-a6d3-99b4591f2210-kube-api-access-9zq9x\") pod \"cinder-api-0\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.290959 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.381175 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.504187 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-svc\") pod \"2d6794a2-d312-4727-9196-d030dea17b67\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.504598 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-swift-storage-0\") pod \"2d6794a2-d312-4727-9196-d030dea17b67\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.504637 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69tdf\" (UniqueName: \"kubernetes.io/projected/2d6794a2-d312-4727-9196-d030dea17b67-kube-api-access-69tdf\") pod \"2d6794a2-d312-4727-9196-d030dea17b67\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.504705 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-sb\") pod \"2d6794a2-d312-4727-9196-d030dea17b67\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.504823 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-nb\") pod \"2d6794a2-d312-4727-9196-d030dea17b67\" (UID: 
\"2d6794a2-d312-4727-9196-d030dea17b67\") " Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.504851 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-config\") pod \"2d6794a2-d312-4727-9196-d030dea17b67\" (UID: \"2d6794a2-d312-4727-9196-d030dea17b67\") " Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.509881 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6794a2-d312-4727-9196-d030dea17b67-kube-api-access-69tdf" (OuterVolumeSpecName: "kube-api-access-69tdf") pod "2d6794a2-d312-4727-9196-d030dea17b67" (UID: "2d6794a2-d312-4727-9196-d030dea17b67"). InnerVolumeSpecName "kube-api-access-69tdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.554583 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2d6794a2-d312-4727-9196-d030dea17b67" (UID: "2d6794a2-d312-4727-9196-d030dea17b67"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.563675 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.565600 4728 generic.go:334] "Generic (PLEG): container finished" podID="2d6794a2-d312-4727-9196-d030dea17b67" containerID="d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2" exitCode=0 Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.565694 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755865797c-j8rm6" event={"ID":"2d6794a2-d312-4727-9196-d030dea17b67","Type":"ContainerDied","Data":"d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.565725 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755865797c-j8rm6" event={"ID":"2d6794a2-d312-4727-9196-d030dea17b67","Type":"ContainerDied","Data":"62ffc3be0a56572de6a691ff14f8b25adc82dee9035417d7a11bb1a639a48a7c"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.565743 4728 scope.go:117] "RemoveContainer" containerID="d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.565892 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755865797c-j8rm6" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.566672 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d6794a2-d312-4727-9196-d030dea17b67" (UID: "2d6794a2-d312-4727-9196-d030dea17b67"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:38 crc kubenswrapper[4728]: W0125 05:54:38.571928 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22486ac_1aae_4d93_9c43_8262f94761ff.slice/crio-587841ac3cb954286d81893d66d23e897a3b6b65ea488a344a02bc1925c4e885 WatchSource:0}: Error finding container 587841ac3cb954286d81893d66d23e897a3b6b65ea488a344a02bc1925c4e885: Status 404 returned error can't find the container with id 587841ac3cb954286d81893d66d23e897a3b6b65ea488a344a02bc1925c4e885 Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.578287 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b8466696b-psmv9" event={"ID":"36477670-fd4c-4015-8fab-b7608c72a906","Type":"ContainerStarted","Data":"d554a1bb691c33c3f796e19c929c659d16afdd7fd9cc0521c1c0c06323ea837a"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.578362 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b8466696b-psmv9" event={"ID":"36477670-fd4c-4015-8fab-b7608c72a906","Type":"ContainerStarted","Data":"473c88bc86e75a3f82cd9768f8192408fe55805f7ccbdfd80a06a0db82f4f06e"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.578721 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.578768 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b8466696b-psmv9" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.579264 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d6794a2-d312-4727-9196-d030dea17b67" (UID: "2d6794a2-d312-4727-9196-d030dea17b67"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.579344 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-config" (OuterVolumeSpecName: "config") pod "2d6794a2-d312-4727-9196-d030dea17b67" (UID: "2d6794a2-d312-4727-9196-d030dea17b67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.581749 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d6794a2-d312-4727-9196-d030dea17b67" (UID: "2d6794a2-d312-4727-9196-d030dea17b67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.608782 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.608810 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.608820 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69tdf\" (UniqueName: \"kubernetes.io/projected/2d6794a2-d312-4727-9196-d030dea17b67-kube-api-access-69tdf\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.608830 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.608839 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.608848 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6794a2-d312-4727-9196-d030dea17b67-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.613992 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" event={"ID":"4564c893-fa22-47c0-92b9-4d503b3553ee","Type":"ContainerStarted","Data":"5fd093d83eba7fd416d8236f3e4835a0976a992de8e160b96df5b0f505692690"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.614047 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" event={"ID":"4564c893-fa22-47c0-92b9-4d503b3553ee","Type":"ContainerStarted","Data":"5d2473cbcd0e274343fc39217686826b71cfb05add11babd17415b25ddea810a"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.603254 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b8466696b-psmv9" podStartSLOduration=4.603228681 podStartE2EDuration="4.603228681s" podCreationTimestamp="2026-01-25 05:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:38.595280074 +0000 UTC m=+969.631158044" watchObservedRunningTime="2026-01-25 05:54:38.603228681 +0000 UTC m=+969.639106662" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.624883 4728 scope.go:117] "RemoveContainer" containerID="9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.640108 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b49f9f9b7-2ld95" event={"ID":"fd2e0efe-0434-4103-a08d-d014f69addf6","Type":"ContainerStarted","Data":"57d0829d412c09180d6c0a18407527ae6c9f82ff5feef327595d97fc388793eb"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.640455 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b49f9f9b7-2ld95" event={"ID":"fd2e0efe-0434-4103-a08d-d014f69addf6","Type":"ContainerStarted","Data":"21d3fe3724f6ca73826a6d27d9744cd74836425d43c235bbd65791f8a84279b1"} Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.656471 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b587b754-6vhmj" podStartSLOduration=2.361005337 podStartE2EDuration="6.6564513s" podCreationTimestamp="2026-01-25 05:54:32 +0000 UTC" firstStartedPulling="2026-01-25 05:54:33.193108686 +0000 UTC m=+964.228986656" lastFinishedPulling="2026-01-25 05:54:37.488554629 +0000 UTC m=+968.524432619" observedRunningTime="2026-01-25 05:54:38.637566019 +0000 UTC m=+969.673444019" watchObservedRunningTime="2026-01-25 05:54:38.6564513 +0000 UTC m=+969.692329270" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.665716 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b49f9f9b7-2ld95" podStartSLOduration=2.362486039 podStartE2EDuration="6.665694208s" podCreationTimestamp="2026-01-25 05:54:32 +0000 UTC" firstStartedPulling="2026-01-25 05:54:33.192794344 +0000 UTC m=+964.228672324" lastFinishedPulling="2026-01-25 05:54:37.496002513 +0000 UTC m=+968.531880493" observedRunningTime="2026-01-25 05:54:38.662343359 +0000 UTC m=+969.698221340" watchObservedRunningTime="2026-01-25 05:54:38.665694208 +0000 UTC m=+969.701572188" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.691018 4728 scope.go:117] "RemoveContainer" 
containerID="d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2" Jan 25 05:54:38 crc kubenswrapper[4728]: E0125 05:54:38.691394 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2\": container with ID starting with d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2 not found: ID does not exist" containerID="d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.691426 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2"} err="failed to get container status \"d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2\": rpc error: code = NotFound desc = could not find container \"d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2\": container with ID starting with d2192df1da5204b4a9905dc8f331fb8a14815a26b9ffa6281b537d76023801c2 not found: ID does not exist" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.691452 4728 scope.go:117] "RemoveContainer" containerID="9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.692110 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerStarted","Data":"2667be4838d6d35f33cfb0287b1f874e6e735b949eaa68e4656dcaaa5885f92c"} Jan 25 05:54:38 crc kubenswrapper[4728]: E0125 05:54:38.692162 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34\": container with ID starting with 9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34 not found: ID does not 
exist" containerID="9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.692177 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34"} err="failed to get container status \"9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34\": rpc error: code = NotFound desc = could not find container \"9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34\": container with ID starting with 9f4cf5de27615cdb44a205948bee292caea3f123c66f62c4eb55d5c531bfcc34 not found: ID does not exist" Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.753576 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb49f8487-rfk6x"] Jan 25 05:54:38 crc kubenswrapper[4728]: W0125 05:54:38.754263 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc05bf129_98d0_4259_b8f8_f86b1e68b084.slice/crio-c867656b4c572559f4614fe953085d7839565c880416addb14bf5688db42f353 WatchSource:0}: Error finding container c867656b4c572559f4614fe953085d7839565c880416addb14bf5688db42f353: Status 404 returned error can't find the container with id c867656b4c572559f4614fe953085d7839565c880416addb14bf5688db42f353 Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.851793 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.917960 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-755865797c-j8rm6"] Jan 25 05:54:38 crc kubenswrapper[4728]: I0125 05:54:38.926202 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-755865797c-j8rm6"] Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.374206 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2d6794a2-d312-4727-9196-d030dea17b67" path="/var/lib/kubelet/pods/2d6794a2-d312-4727-9196-d030dea17b67/volumes" Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.719647 4728 generic.go:334] "Generic (PLEG): container finished" podID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerID="17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3" exitCode=0 Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.720008 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" event={"ID":"c05bf129-98d0-4259-b8f8-f86b1e68b084","Type":"ContainerDied","Data":"17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3"} Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.720045 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" event={"ID":"c05bf129-98d0-4259-b8f8-f86b1e68b084","Type":"ContainerStarted","Data":"c867656b4c572559f4614fe953085d7839565c880416addb14bf5688db42f353"} Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.724972 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.734577 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc41a951-4159-45ba-a6d3-99b4591f2210","Type":"ContainerStarted","Data":"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"} Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.734624 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc41a951-4159-45ba-a6d3-99b4591f2210","Type":"ContainerStarted","Data":"f6caf628592d1f8e3eab3444f17f5144e6f03d72ae5530d4759765f6c16cfd32"} Jan 25 05:54:39 crc kubenswrapper[4728]: I0125 05:54:39.744862 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e22486ac-1aae-4d93-9c43-8262f94761ff","Type":"ContainerStarted","Data":"587841ac3cb954286d81893d66d23e897a3b6b65ea488a344a02bc1925c4e885"} Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.787117 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e22486ac-1aae-4d93-9c43-8262f94761ff","Type":"ContainerStarted","Data":"3b4788b39814e2c330813eb84ec97a6bb04229060dfef1b0ecaf003c272d21ad"} Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.831496 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" event={"ID":"c05bf129-98d0-4259-b8f8-f86b1e68b084","Type":"ContainerStarted","Data":"73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc"} Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.832672 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.847219 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc41a951-4159-45ba-a6d3-99b4591f2210","Type":"ContainerStarted","Data":"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f"} Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.847380 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api-log" containerID="cri-o://199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472" gracePeriod=30 Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.847641 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.847679 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api" 
containerID="cri-o://5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f" gracePeriod=30 Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.894158 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.894148088 podStartE2EDuration="3.894148088s" podCreationTimestamp="2026-01-25 05:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:40.888723709 +0000 UTC m=+971.924601689" watchObservedRunningTime="2026-01-25 05:54:40.894148088 +0000 UTC m=+971.930026068" Jan 25 05:54:40 crc kubenswrapper[4728]: I0125 05:54:40.895113 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" podStartSLOduration=3.895106555 podStartE2EDuration="3.895106555s" podCreationTimestamp="2026-01-25 05:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:40.866643063 +0000 UTC m=+971.902521043" watchObservedRunningTime="2026-01-25 05:54:40.895106555 +0000 UTC m=+971.930984535" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.445639 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490392 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-scripts\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data-custom\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490631 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-combined-ca-bundle\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490682 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zq9x\" (UniqueName: \"kubernetes.io/projected/cc41a951-4159-45ba-a6d3-99b4591f2210-kube-api-access-9zq9x\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490706 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc41a951-4159-45ba-a6d3-99b4591f2210-logs\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490752 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/cc41a951-4159-45ba-a6d3-99b4591f2210-etc-machine-id\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.490886 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data\") pod \"cc41a951-4159-45ba-a6d3-99b4591f2210\" (UID: \"cc41a951-4159-45ba-a6d3-99b4591f2210\") " Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.491486 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc41a951-4159-45ba-a6d3-99b4591f2210-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.492111 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc41a951-4159-45ba-a6d3-99b4591f2210-logs" (OuterVolumeSpecName: "logs") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.493400 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc41a951-4159-45ba-a6d3-99b4591f2210-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.493414 4728 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc41a951-4159-45ba-a6d3-99b4591f2210-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.498446 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.501434 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc41a951-4159-45ba-a6d3-99b4591f2210-kube-api-access-9zq9x" (OuterVolumeSpecName: "kube-api-access-9zq9x") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "kube-api-access-9zq9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.509827 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-scripts" (OuterVolumeSpecName: "scripts") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.517110 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.535486 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data" (OuterVolumeSpecName: "config-data") pod "cc41a951-4159-45ba-a6d3-99b4591f2210" (UID: "cc41a951-4159-45ba-a6d3-99b4591f2210"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.595503 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.595545 4728 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.595557 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.595567 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zq9x\" (UniqueName: \"kubernetes.io/projected/cc41a951-4159-45ba-a6d3-99b4591f2210-kube-api-access-9zq9x\") on node \"crc\" 
DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.595576 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc41a951-4159-45ba-a6d3-99b4591f2210-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.855920 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e22486ac-1aae-4d93-9c43-8262f94761ff","Type":"ContainerStarted","Data":"f6bb74a208461c725dfbc6af1eaaddea16dd114ab2552a9f85559a7b52ce3df0"} Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862605 4728 generic.go:334] "Generic (PLEG): container finished" podID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerID="5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f" exitCode=0 Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862638 4728 generic.go:334] "Generic (PLEG): container finished" podID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerID="199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472" exitCode=143 Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862648 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862696 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc41a951-4159-45ba-a6d3-99b4591f2210","Type":"ContainerDied","Data":"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f"} Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862729 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc41a951-4159-45ba-a6d3-99b4591f2210","Type":"ContainerDied","Data":"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"} Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862741 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cc41a951-4159-45ba-a6d3-99b4591f2210","Type":"ContainerDied","Data":"f6caf628592d1f8e3eab3444f17f5144e6f03d72ae5530d4759765f6c16cfd32"} Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.862759 4728 scope.go:117] "RemoveContainer" containerID="5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.877395 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.800604239 podStartE2EDuration="4.877380581s" podCreationTimestamp="2026-01-25 05:54:37 +0000 UTC" firstStartedPulling="2026-01-25 05:54:38.601395666 +0000 UTC m=+969.637273646" lastFinishedPulling="2026-01-25 05:54:39.678172008 +0000 UTC m=+970.714049988" observedRunningTime="2026-01-25 05:54:41.873860624 +0000 UTC m=+972.909738623" watchObservedRunningTime="2026-01-25 05:54:41.877380581 +0000 UTC m=+972.913258561" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.909117 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.922101 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-api-0"] Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.933584 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:41 crc kubenswrapper[4728]: E0125 05:54:41.933975 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.933997 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api" Jan 25 05:54:41 crc kubenswrapper[4728]: E0125 05:54:41.934010 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6794a2-d312-4727-9196-d030dea17b67" containerName="dnsmasq-dns" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.934017 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6794a2-d312-4727-9196-d030dea17b67" containerName="dnsmasq-dns" Jan 25 05:54:41 crc kubenswrapper[4728]: E0125 05:54:41.934041 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6794a2-d312-4727-9196-d030dea17b67" containerName="init" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.934047 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6794a2-d312-4727-9196-d030dea17b67" containerName="init" Jan 25 05:54:41 crc kubenswrapper[4728]: E0125 05:54:41.934061 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api-log" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.934067 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api-log" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.934226 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.934243 4728 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" containerName="cinder-api-log" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.934257 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6794a2-d312-4727-9196-d030dea17b67" containerName="dnsmasq-dns" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.935196 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.942038 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.942225 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.942358 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 25 05:54:41 crc kubenswrapper[4728]: I0125 05:54:41.980173 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012402 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-config-data\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012485 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-scripts\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012513 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012544 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012570 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x562h\" (UniqueName: \"kubernetes.io/projected/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-kube-api-access-x562h\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012586 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012602 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-logs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.012627 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.115843 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x562h\" (UniqueName: \"kubernetes.io/projected/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-kube-api-access-x562h\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.115946 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.115987 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-logs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.116092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc 
kubenswrapper[4728]: I0125 05:54:42.116275 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.116553 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-logs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.116585 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-config-data\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.116658 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.116775 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-scripts\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.116863 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.117130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.120836 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.121256 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.121334 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.121869 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-config-data\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.122528 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.123984 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-scripts\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.130175 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x562h\" (UniqueName: \"kubernetes.io/projected/b9e0ee7f-ce38-4e4f-afe0-993551ae84a8-kube-api-access-x562h\") pod \"cinder-api-0\" (UID: \"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8\") " pod="openstack/cinder-api-0" Jan 25 05:54:42 crc kubenswrapper[4728]: I0125 05:54:42.277070 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 25 05:54:43 crc kubenswrapper[4728]: I0125 05:54:43.071642 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 25 05:54:43 crc kubenswrapper[4728]: I0125 05:54:43.337459 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc41a951-4159-45ba-a6d3-99b4591f2210" path="/var/lib/kubelet/pods/cc41a951-4159-45ba-a6d3-99b4591f2210/volumes" Jan 25 05:54:43 crc kubenswrapper[4728]: I0125 05:54:43.865715 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:43 crc kubenswrapper[4728]: I0125 05:54:43.898828 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:44 crc kubenswrapper[4728]: I0125 05:54:44.087313 4728 scope.go:117] "RemoveContainer" containerID="199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472" Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.013769 4728 scope.go:117] "RemoveContainer" containerID="5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f" Jan 25 05:54:45 crc kubenswrapper[4728]: E0125 05:54:45.014576 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f\": container with ID starting with 5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f not found: ID does not exist" containerID="5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f" Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.014629 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f"} err="failed to get container status 
\"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f\": rpc error: code = NotFound desc = could not find container \"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f\": container with ID starting with 5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f not found: ID does not exist"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.014665 4728 scope.go:117] "RemoveContainer" containerID="199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"
Jan 25 05:54:45 crc kubenswrapper[4728]: E0125 05:54:45.015175 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472\": container with ID starting with 199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472 not found: ID does not exist" containerID="199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.015210 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"} err="failed to get container status \"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472\": rpc error: code = NotFound desc = could not find container \"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472\": container with ID starting with 199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472 not found: ID does not exist"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.015230 4728 scope.go:117] "RemoveContainer" containerID="5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.015649 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f"} err="failed to get container status \"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f\": rpc error: code = NotFound desc = could not find container \"5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f\": container with ID starting with 5eb34ba8eba8dc7bbba3780c3ce179927a343187245bdc4dbe5b591793351f1f not found: ID does not exist"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.015675 4728 scope.go:117] "RemoveContainer" containerID="199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.016089 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472"} err="failed to get container status \"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472\": rpc error: code = NotFound desc = could not find container \"199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472\": container with ID starting with 199cc2559c62733fe9e1a84f2c22cf15c4b7b554f315928c515df40f320ee472 not found: ID does not exist"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.442377 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 25 05:54:45 crc kubenswrapper[4728]: W0125 05:54:45.446460 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9e0ee7f_ce38_4e4f_afe0_993551ae84a8.slice/crio-79f7873f0c9dd483bea4e2788ab8e3da31fd355ff44ebf05b459601164b4cd8f WatchSource:0}: Error finding container 79f7873f0c9dd483bea4e2788ab8e3da31fd355ff44ebf05b459601164b4cd8f: Status 404 returned error can't find the container with id 79f7873f0c9dd483bea4e2788ab8e3da31fd355ff44ebf05b459601164b4cd8f
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.919874 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerStarted","Data":"d9da0cda1b001b678afb1d960f89bbd1684e577a11a61d698615e11932d0a053"}
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.920204 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-central-agent" containerID="cri-o://7d68cf169dbb0066570e7f0b26179fa51d8c9197dfb09193001d709d71b208a0" gracePeriod=30
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.920487 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.920735 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="proxy-httpd" containerID="cri-o://d9da0cda1b001b678afb1d960f89bbd1684e577a11a61d698615e11932d0a053" gracePeriod=30
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.920790 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="sg-core" containerID="cri-o://2667be4838d6d35f33cfb0287b1f874e6e735b949eaa68e4656dcaaa5885f92c" gracePeriod=30
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.920823 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-notification-agent" containerID="cri-o://bb02b25ccc758d25fe4d3563f6da4e8d2e86e6d20c03953fa5e70c7b5ca0ae42" gracePeriod=30
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.922939 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8","Type":"ContainerStarted","Data":"79f7873f0c9dd483bea4e2788ab8e3da31fd355ff44ebf05b459601164b4cd8f"}
Jan 25 05:54:45 crc kubenswrapper[4728]: I0125 05:54:45.962821 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.049662452 podStartE2EDuration="42.962803346s" podCreationTimestamp="2026-01-25 05:54:03 +0000 UTC" firstStartedPulling="2026-01-25 05:54:04.151980796 +0000 UTC m=+935.187858776" lastFinishedPulling="2026-01-25 05:54:45.06512169 +0000 UTC m=+976.100999670" observedRunningTime="2026-01-25 05:54:45.952433802 +0000 UTC m=+976.988311781" watchObservedRunningTime="2026-01-25 05:54:45.962803346 +0000 UTC m=+976.998681326"
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.117416 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b8466696b-psmv9"
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.172690 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b8466696b-psmv9"
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.237387 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5479665bfd-x264b"]
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.237704 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5479665bfd-x264b" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api-log" containerID="cri-o://f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa" gracePeriod=30
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.237894 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5479665bfd-x264b" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api" containerID="cri-o://02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65" gracePeriod=30
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.818871 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-55586495d8-v5qdm" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.149:9696/\": dial tcp 10.217.0.149:9696: connect: connection refused"
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.945702 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerID="f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa" exitCode=143
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.945742 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5479665bfd-x264b" event={"ID":"d6afcd9c-55bf-4a07-b8bf-311d452bc324","Type":"ContainerDied","Data":"f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948670 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerID="d9da0cda1b001b678afb1d960f89bbd1684e577a11a61d698615e11932d0a053" exitCode=0
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948704 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerID="2667be4838d6d35f33cfb0287b1f874e6e735b949eaa68e4656dcaaa5885f92c" exitCode=2
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948713 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerID="bb02b25ccc758d25fe4d3563f6da4e8d2e86e6d20c03953fa5e70c7b5ca0ae42" exitCode=0
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948721 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerID="7d68cf169dbb0066570e7f0b26179fa51d8c9197dfb09193001d709d71b208a0" exitCode=0
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948753 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerDied","Data":"d9da0cda1b001b678afb1d960f89bbd1684e577a11a61d698615e11932d0a053"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948798 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerDied","Data":"2667be4838d6d35f33cfb0287b1f874e6e735b949eaa68e4656dcaaa5885f92c"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948811 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerDied","Data":"bb02b25ccc758d25fe4d3563f6da4e8d2e86e6d20c03953fa5e70c7b5ca0ae42"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.948829 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerDied","Data":"7d68cf169dbb0066570e7f0b26179fa51d8c9197dfb09193001d709d71b208a0"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.950569 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.952001 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8","Type":"ContainerStarted","Data":"e9009e85181aeb4a2aa3c18901531c951528765844b02a61bf710d80af170823"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.952036 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9e0ee7f-ce38-4e4f-afe0-993551ae84a8","Type":"ContainerStarted","Data":"a4c2012fa1cad4b9044cbe0b2be27de2e42bdd755db392b270af2811df4bc734"}
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.952154 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 25 05:54:46 crc kubenswrapper[4728]: I0125 05:54:46.987662 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.987649839 podStartE2EDuration="5.987649839s" podCreationTimestamp="2026-01-25 05:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:46.985616395 +0000 UTC m=+978.021494365" watchObservedRunningTime="2026-01-25 05:54:46.987649839 +0000 UTC m=+978.023527819"
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.015818 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7rfn\" (UniqueName: \"kubernetes.io/projected/d6b831ac-72a1-4f11-95c8-b3ee47275501-kube-api-access-q7rfn\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.015876 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-run-httpd\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.015903 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-combined-ca-bundle\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.016027 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-log-httpd\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.016146 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-sg-core-conf-yaml\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.016172 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-scripts\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.016207 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-config-data\") pod \"d6b831ac-72a1-4f11-95c8-b3ee47275501\" (UID: \"d6b831ac-72a1-4f11-95c8-b3ee47275501\") "
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.021660 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.021980 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.026492 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-scripts" (OuterVolumeSpecName: "scripts") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.026895 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b831ac-72a1-4f11-95c8-b3ee47275501-kube-api-access-q7rfn" (OuterVolumeSpecName: "kube-api-access-q7rfn") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "kube-api-access-q7rfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.054934 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.089832 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.105701 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-config-data" (OuterVolumeSpecName: "config-data") pod "d6b831ac-72a1-4f11-95c8-b3ee47275501" (UID: "d6b831ac-72a1-4f11-95c8-b3ee47275501"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118784 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7rfn\" (UniqueName: \"kubernetes.io/projected/d6b831ac-72a1-4f11-95c8-b3ee47275501-kube-api-access-q7rfn\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118838 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118849 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118862 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6b831ac-72a1-4f11-95c8-b3ee47275501-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118871 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118879 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-scripts\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.118889 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b831ac-72a1-4f11-95c8-b3ee47275501-config-data\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.962124 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6b831ac-72a1-4f11-95c8-b3ee47275501","Type":"ContainerDied","Data":"3663d4d024db7e0849d4842ccdfeaccd60fddde19df6b9e621ca38c761d1ea1c"}
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.962377 4728 scope.go:117] "RemoveContainer" containerID="d9da0cda1b001b678afb1d960f89bbd1684e577a11a61d698615e11932d0a053"
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.962190 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.970890 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.980195 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.983656 4728 scope.go:117] "RemoveContainer" containerID="2667be4838d6d35f33cfb0287b1f874e6e735b949eaa68e4656dcaaa5885f92c"
Jan 25 05:54:47 crc kubenswrapper[4728]: I0125 05:54:47.988272 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.002525 4728 scope.go:117] "RemoveContainer" containerID="bb02b25ccc758d25fe4d3563f6da4e8d2e86e6d20c03953fa5e70c7b5ca0ae42"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011125 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:48 crc kubenswrapper[4728]: E0125 05:54:48.011615 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="proxy-httpd"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011640 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="proxy-httpd"
Jan 25 05:54:48 crc kubenswrapper[4728]: E0125 05:54:48.011656 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="sg-core"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011665 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="sg-core"
Jan 25 05:54:48 crc kubenswrapper[4728]: E0125 05:54:48.011683 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-central-agent"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011690 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-central-agent"
Jan 25 05:54:48 crc kubenswrapper[4728]: E0125 05:54:48.011714 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-notification-agent"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011720 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-notification-agent"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011974 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-central-agent"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.011996 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="ceilometer-notification-agent"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.012007 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="proxy-httpd"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.012024 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" containerName="sg-core"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.013770 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.017019 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.017142 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.023300 4728 scope.go:117] "RemoveContainer" containerID="7d68cf169dbb0066570e7f0b26179fa51d8c9197dfb09193001d709d71b208a0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.023310 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.045338 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f74fbc68-hj87v"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139298 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-scripts\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139394 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckxrf\" (UniqueName: \"kubernetes.io/projected/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-kube-api-access-ckxrf\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139422 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139482 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-run-httpd\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139561 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-config-data\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.139593 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-log-httpd\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.169035 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.235132 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-975f74cf9-g4lbb"]
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.235473 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerName="dnsmasq-dns" containerID="cri-o://a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6" gracePeriod=10
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241382 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckxrf\" (UniqueName: \"kubernetes.io/projected/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-kube-api-access-ckxrf\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241433 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241467 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241514 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-run-httpd\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241559 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-config-data\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241582 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-log-httpd\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.241643 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-scripts\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.242995 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-run-httpd\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.244196 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-log-httpd\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.248153 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-config-data\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.248796 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.256453 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-scripts\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.256889 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckxrf\" (UniqueName: \"kubernetes.io/projected/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-kube-api-access-ckxrf\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.264178 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.309393 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.333478 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.364881 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.556442 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9455cf8b5-xnn9p"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.648388 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb"
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.755098 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-swift-storage-0\") pod \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") "
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.755162 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-config\") pod \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") "
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.755265 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-nb\") pod \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") "
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.755294 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-sb\") pod \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") "
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.755384 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-svc\") pod \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") "
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.755492 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h9tw\" (UniqueName: \"kubernetes.io/projected/d7c8b502-0768-4226-aae0-f6e9f639cb9a-kube-api-access-7h9tw\") pod \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\" (UID: \"d7c8b502-0768-4226-aae0-f6e9f639cb9a\") "
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.760748 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c8b502-0768-4226-aae0-f6e9f639cb9a-kube-api-access-7h9tw" (OuterVolumeSpecName: "kube-api-access-7h9tw") pod "d7c8b502-0768-4226-aae0-f6e9f639cb9a" (UID: "d7c8b502-0768-4226-aae0-f6e9f639cb9a"). InnerVolumeSpecName "kube-api-access-7h9tw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.799032 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.802777 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7c8b502-0768-4226-aae0-f6e9f639cb9a" (UID: "d7c8b502-0768-4226-aae0-f6e9f639cb9a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.808227 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7c8b502-0768-4226-aae0-f6e9f639cb9a" (UID: "d7c8b502-0768-4226-aae0-f6e9f639cb9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.809680 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-config" (OuterVolumeSpecName: "config") pod "d7c8b502-0768-4226-aae0-f6e9f639cb9a" (UID: "d7c8b502-0768-4226-aae0-f6e9f639cb9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.814156 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7c8b502-0768-4226-aae0-f6e9f639cb9a" (UID: "d7c8b502-0768-4226-aae0-f6e9f639cb9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.816663 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7c8b502-0768-4226-aae0-f6e9f639cb9a" (UID: "d7c8b502-0768-4226-aae0-f6e9f639cb9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.857522 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.857554 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h9tw\" (UniqueName: \"kubernetes.io/projected/d7c8b502-0768-4226-aae0-f6e9f639cb9a-kube-api-access-7h9tw\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.857567 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.857577 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-config\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.857585 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.857596 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c8b502-0768-4226-aae0-f6e9f639cb9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.972238 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerStarted","Data":"c0eac17da593a27684c639b98a19936ca733dcfeef871b35f89e53300f744e8a"}
Jan 25 05:54:48 crc kubenswrapper[4728]:
I0125 05:54:48.974238 4728 generic.go:334] "Generic (PLEG): container finished" podID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerID="a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6" exitCode=0 Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.974312 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.974374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" event={"ID":"d7c8b502-0768-4226-aae0-f6e9f639cb9a","Type":"ContainerDied","Data":"a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6"} Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.974404 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-975f74cf9-g4lbb" event={"ID":"d7c8b502-0768-4226-aae0-f6e9f639cb9a","Type":"ContainerDied","Data":"170a2a1df22a60660e997f28fa09a00dfa164234d9c81963bf289fa20eed733c"} Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.974421 4728 scope.go:117] "RemoveContainer" containerID="a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6" Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.974860 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="cinder-scheduler" containerID="cri-o://3b4788b39814e2c330813eb84ec97a6bb04229060dfef1b0ecaf003c272d21ad" gracePeriod=30 Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.974965 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="probe" containerID="cri-o://f6bb74a208461c725dfbc6af1eaaddea16dd114ab2552a9f85559a7b52ce3df0" gracePeriod=30 Jan 25 05:54:48 crc kubenswrapper[4728]: I0125 05:54:48.998959 4728 scope.go:117] 
"RemoveContainer" containerID="d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.000733 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-975f74cf9-g4lbb"] Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.007633 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-975f74cf9-g4lbb"] Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.030013 4728 scope.go:117] "RemoveContainer" containerID="a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6" Jan 25 05:54:49 crc kubenswrapper[4728]: E0125 05:54:49.030479 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6\": container with ID starting with a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6 not found: ID does not exist" containerID="a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.030511 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6"} err="failed to get container status \"a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6\": rpc error: code = NotFound desc = could not find container \"a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6\": container with ID starting with a9ab5295ca6e6509b53817d03f3fe52adfda4f9af25ae32aa97340ec3b1473a6 not found: ID does not exist" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.030531 4728 scope.go:117] "RemoveContainer" containerID="d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916" Jan 25 05:54:49 crc kubenswrapper[4728]: E0125 05:54:49.030862 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916\": container with ID starting with d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916 not found: ID does not exist" containerID="d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.030886 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916"} err="failed to get container status \"d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916\": rpc error: code = NotFound desc = could not find container \"d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916\": container with ID starting with d27a4f250190558a535d86f641f89ef0416485ddf0321d9eb0e890513cc2a916 not found: ID does not exist" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.337437 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b831ac-72a1-4f11-95c8-b3ee47275501" path="/var/lib/kubelet/pods/d6b831ac-72a1-4f11-95c8-b3ee47275501/volumes" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.338446 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" path="/var/lib/kubelet/pods/d7c8b502-0768-4226-aae0-f6e9f639cb9a/volumes" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.372006 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5479665bfd-x264b" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:33536->10.217.0.157:9311: read: connection reset by peer" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.372012 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5479665bfd-x264b" 
podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:33544->10.217.0.157:9311: read: connection reset by peer" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.733504 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.773302 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6afcd9c-55bf-4a07-b8bf-311d452bc324-logs\") pod \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.773478 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-combined-ca-bundle\") pod \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.773677 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data-custom\") pod \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.773685 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6afcd9c-55bf-4a07-b8bf-311d452bc324-logs" (OuterVolumeSpecName: "logs") pod "d6afcd9c-55bf-4a07-b8bf-311d452bc324" (UID: "d6afcd9c-55bf-4a07-b8bf-311d452bc324"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.773769 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42x2k\" (UniqueName: \"kubernetes.io/projected/d6afcd9c-55bf-4a07-b8bf-311d452bc324-kube-api-access-42x2k\") pod \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.773855 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data\") pod \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\" (UID: \"d6afcd9c-55bf-4a07-b8bf-311d452bc324\") " Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.774556 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6afcd9c-55bf-4a07-b8bf-311d452bc324-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.777217 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d6afcd9c-55bf-4a07-b8bf-311d452bc324" (UID: "d6afcd9c-55bf-4a07-b8bf-311d452bc324"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.777463 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6afcd9c-55bf-4a07-b8bf-311d452bc324-kube-api-access-42x2k" (OuterVolumeSpecName: "kube-api-access-42x2k") pod "d6afcd9c-55bf-4a07-b8bf-311d452bc324" (UID: "d6afcd9c-55bf-4a07-b8bf-311d452bc324"). InnerVolumeSpecName "kube-api-access-42x2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.793978 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6afcd9c-55bf-4a07-b8bf-311d452bc324" (UID: "d6afcd9c-55bf-4a07-b8bf-311d452bc324"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.810570 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data" (OuterVolumeSpecName: "config-data") pod "d6afcd9c-55bf-4a07-b8bf-311d452bc324" (UID: "d6afcd9c-55bf-4a07-b8bf-311d452bc324"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.876673 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.876699 4728 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.876710 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42x2k\" (UniqueName: \"kubernetes.io/projected/d6afcd9c-55bf-4a07-b8bf-311d452bc324-kube-api-access-42x2k\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.876722 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6afcd9c-55bf-4a07-b8bf-311d452bc324-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.986336 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerStarted","Data":"d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65"} Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.990973 4728 generic.go:334] "Generic (PLEG): container finished" podID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerID="f6bb74a208461c725dfbc6af1eaaddea16dd114ab2552a9f85559a7b52ce3df0" exitCode=0 Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.991063 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e22486ac-1aae-4d93-9c43-8262f94761ff","Type":"ContainerDied","Data":"f6bb74a208461c725dfbc6af1eaaddea16dd114ab2552a9f85559a7b52ce3df0"} Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.996253 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerID="02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65" exitCode=0 Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.996301 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5479665bfd-x264b" event={"ID":"d6afcd9c-55bf-4a07-b8bf-311d452bc324","Type":"ContainerDied","Data":"02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65"} Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.996360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5479665bfd-x264b" event={"ID":"d6afcd9c-55bf-4a07-b8bf-311d452bc324","Type":"ContainerDied","Data":"0295631be9b1a80b65bb9b7630b9c9394f86268e32b1076acb2b2f894161e66e"} Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 05:54:49.996386 4728 scope.go:117] "RemoveContainer" containerID="02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65" Jan 25 05:54:49 crc kubenswrapper[4728]: I0125 
05:54:49.996498 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5479665bfd-x264b" Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.050174 4728 scope.go:117] "RemoveContainer" containerID="f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa" Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.052434 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5479665bfd-x264b"] Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.061901 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5479665bfd-x264b"] Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.067469 4728 scope.go:117] "RemoveContainer" containerID="02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65" Jan 25 05:54:50 crc kubenswrapper[4728]: E0125 05:54:50.067848 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65\": container with ID starting with 02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65 not found: ID does not exist" containerID="02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65" Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.067878 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65"} err="failed to get container status \"02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65\": rpc error: code = NotFound desc = could not find container \"02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65\": container with ID starting with 02caba5219c3c8c43e611b527cbd7a874a6c04a280fc6bf700dd7f04c6674d65 not found: ID does not exist" Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.067899 4728 scope.go:117] "RemoveContainer" 
containerID="f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa" Jan 25 05:54:50 crc kubenswrapper[4728]: E0125 05:54:50.068110 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa\": container with ID starting with f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa not found: ID does not exist" containerID="f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa" Jan 25 05:54:50 crc kubenswrapper[4728]: I0125 05:54:50.068131 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa"} err="failed to get container status \"f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa\": rpc error: code = NotFound desc = could not find container \"f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa\": container with ID starting with f5a23e261106d94c1df0cd871de1786e016141388f61427c21f1b9e1c0461bfa not found: ID does not exist" Jan 25 05:54:50 crc kubenswrapper[4728]: E0125 05:54:50.896253 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22486ac_1aae_4d93_9c43_8262f94761ff.slice/crio-conmon-3b4788b39814e2c330813eb84ec97a6bb04229060dfef1b0ecaf003c272d21ad.scope\": RecentStats: unable to find data in memory cache]" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.010724 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerStarted","Data":"2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9"} Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.012427 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerID="3b4788b39814e2c330813eb84ec97a6bb04229060dfef1b0ecaf003c272d21ad" exitCode=0 Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.012491 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e22486ac-1aae-4d93-9c43-8262f94761ff","Type":"ContainerDied","Data":"3b4788b39814e2c330813eb84ec97a6bb04229060dfef1b0ecaf003c272d21ad"} Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.160421 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.210232 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxz8z\" (UniqueName: \"kubernetes.io/projected/e22486ac-1aae-4d93-9c43-8262f94761ff-kube-api-access-kxz8z\") pod \"e22486ac-1aae-4d93-9c43-8262f94761ff\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.210315 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e22486ac-1aae-4d93-9c43-8262f94761ff-etc-machine-id\") pod \"e22486ac-1aae-4d93-9c43-8262f94761ff\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.210519 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data-custom\") pod \"e22486ac-1aae-4d93-9c43-8262f94761ff\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.210591 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-combined-ca-bundle\") pod 
\"e22486ac-1aae-4d93-9c43-8262f94761ff\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.210755 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-scripts\") pod \"e22486ac-1aae-4d93-9c43-8262f94761ff\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.210781 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data\") pod \"e22486ac-1aae-4d93-9c43-8262f94761ff\" (UID: \"e22486ac-1aae-4d93-9c43-8262f94761ff\") " Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.216066 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22486ac-1aae-4d93-9c43-8262f94761ff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e22486ac-1aae-4d93-9c43-8262f94761ff" (UID: "e22486ac-1aae-4d93-9c43-8262f94761ff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.216421 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e22486ac-1aae-4d93-9c43-8262f94761ff" (UID: "e22486ac-1aae-4d93-9c43-8262f94761ff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.220480 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-scripts" (OuterVolumeSpecName: "scripts") pod "e22486ac-1aae-4d93-9c43-8262f94761ff" (UID: "e22486ac-1aae-4d93-9c43-8262f94761ff"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.223750 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22486ac-1aae-4d93-9c43-8262f94761ff-kube-api-access-kxz8z" (OuterVolumeSpecName: "kube-api-access-kxz8z") pod "e22486ac-1aae-4d93-9c43-8262f94761ff" (UID: "e22486ac-1aae-4d93-9c43-8262f94761ff"). InnerVolumeSpecName "kube-api-access-kxz8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.254474 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e22486ac-1aae-4d93-9c43-8262f94761ff" (UID: "e22486ac-1aae-4d93-9c43-8262f94761ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.303162 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data" (OuterVolumeSpecName: "config-data") pod "e22486ac-1aae-4d93-9c43-8262f94761ff" (UID: "e22486ac-1aae-4d93-9c43-8262f94761ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.313573 4728 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.313603 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.313617 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.313625 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22486ac-1aae-4d93-9c43-8262f94761ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.313634 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxz8z\" (UniqueName: \"kubernetes.io/projected/e22486ac-1aae-4d93-9c43-8262f94761ff-kube-api-access-kxz8z\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.313646 4728 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e22486ac-1aae-4d93-9c43-8262f94761ff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.341793 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" path="/var/lib/kubelet/pods/d6afcd9c-55bf-4a07-b8bf-311d452bc324/volumes" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.535184 4728 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54985bc57c-7dmw7" Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.595856 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9455cf8b5-xnn9p"] Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.596098 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9455cf8b5-xnn9p" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-api" containerID="cri-o://45b6c38a6adb6ecce8de335ca3fff0c4f011cf28e4e5b7fab52ca0f28844ae5b" gracePeriod=30 Jan 25 05:54:51 crc kubenswrapper[4728]: I0125 05:54:51.596604 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9455cf8b5-xnn9p" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-httpd" containerID="cri-o://0073b56744b284a02715ef6b0aa4ddac0bbf6877826227bf5355e772f97ecb13" gracePeriod=30 Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.030044 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerStarted","Data":"1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41"} Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.032932 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e22486ac-1aae-4d93-9c43-8262f94761ff","Type":"ContainerDied","Data":"587841ac3cb954286d81893d66d23e897a3b6b65ea488a344a02bc1925c4e885"} Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.033024 4728 scope.go:117] "RemoveContainer" containerID="f6bb74a208461c725dfbc6af1eaaddea16dd114ab2552a9f85559a7b52ce3df0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.033024 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.039741 4728 generic.go:334] "Generic (PLEG): container finished" podID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerID="0073b56744b284a02715ef6b0aa4ddac0bbf6877826227bf5355e772f97ecb13" exitCode=0 Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.039808 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9455cf8b5-xnn9p" event={"ID":"05936d12-72c1-4916-a564-2f4a886c2a0d","Type":"ContainerDied","Data":"0073b56744b284a02715ef6b0aa4ddac0bbf6877826227bf5355e772f97ecb13"} Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.065883 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.077895 4728 scope.go:117] "RemoveContainer" containerID="3b4788b39814e2c330813eb84ec97a6bb04229060dfef1b0ecaf003c272d21ad" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.085218 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.111436 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:52 crc kubenswrapper[4728]: E0125 05:54:52.111923 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api-log" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.111941 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api-log" Jan 25 05:54:52 crc kubenswrapper[4728]: E0125 05:54:52.111960 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.111966 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api" Jan 25 05:54:52 crc kubenswrapper[4728]: E0125 05:54:52.111985 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerName="init" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.111991 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerName="init" Jan 25 05:54:52 crc kubenswrapper[4728]: E0125 05:54:52.112007 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="probe" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112012 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="probe" Jan 25 05:54:52 crc kubenswrapper[4728]: E0125 05:54:52.112026 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="cinder-scheduler" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112031 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="cinder-scheduler" Jan 25 05:54:52 crc kubenswrapper[4728]: E0125 05:54:52.112043 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerName="dnsmasq-dns" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112048 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" containerName="dnsmasq-dns" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112193 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api-log" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112206 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c8b502-0768-4226-aae0-f6e9f639cb9a" 
containerName="dnsmasq-dns" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112218 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="probe" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112226 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" containerName="cinder-scheduler" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.112235 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6afcd9c-55bf-4a07-b8bf-311d452bc324" containerName="barbican-api" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.113171 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.116522 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.118537 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.240572 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-config-data\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.240729 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgjb5\" (UniqueName: \"kubernetes.io/projected/e6ab997e-b648-4c08-9ba9-4166b43ebde2-kube-api-access-jgjb5\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.240883 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.240925 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.241035 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6ab997e-b648-4c08-9ba9-4166b43ebde2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.241146 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-scripts\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.342990 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-scripts\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.343034 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-config-data\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.343083 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgjb5\" (UniqueName: \"kubernetes.io/projected/e6ab997e-b648-4c08-9ba9-4166b43ebde2-kube-api-access-jgjb5\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.343130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.343151 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.343192 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6ab997e-b648-4c08-9ba9-4166b43ebde2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.343280 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6ab997e-b648-4c08-9ba9-4166b43ebde2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.347680 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-scripts\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.355973 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.359866 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgjb5\" (UniqueName: \"kubernetes.io/projected/e6ab997e-b648-4c08-9ba9-4166b43ebde2-kube-api-access-jgjb5\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.361185 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.361492 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-64dbb5f568-n5f5j" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.362860 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ab997e-b648-4c08-9ba9-4166b43ebde2-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"e6ab997e-b648-4c08-9ba9-4166b43ebde2\") " pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.434105 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.848139 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55586495d8-v5qdm_16759a9d-9f02-4e00-b4e4-25ab295d6ffb/neutron-api/0.log" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.848471 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55586495d8-v5qdm" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.960161 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdz4n\" (UniqueName: \"kubernetes.io/projected/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-kube-api-access-gdz4n\") pod \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.960225 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-config\") pod \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.960281 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-httpd-config\") pod \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.960314 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-combined-ca-bundle\") pod 
\"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.960383 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-ovndb-tls-certs\") pod \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\" (UID: \"16759a9d-9f02-4e00-b4e4-25ab295d6ffb\") " Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.966671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "16759a9d-9f02-4e00-b4e4-25ab295d6ffb" (UID: "16759a9d-9f02-4e00-b4e4-25ab295d6ffb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:52 crc kubenswrapper[4728]: I0125 05:54:52.966680 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-kube-api-access-gdz4n" (OuterVolumeSpecName: "kube-api-access-gdz4n") pod "16759a9d-9f02-4e00-b4e4-25ab295d6ffb" (UID: "16759a9d-9f02-4e00-b4e4-25ab295d6ffb"). InnerVolumeSpecName "kube-api-access-gdz4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.003510 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-config" (OuterVolumeSpecName: "config") pod "16759a9d-9f02-4e00-b4e4-25ab295d6ffb" (UID: "16759a9d-9f02-4e00-b4e4-25ab295d6ffb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.003531 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16759a9d-9f02-4e00-b4e4-25ab295d6ffb" (UID: "16759a9d-9f02-4e00-b4e4-25ab295d6ffb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.023500 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "16759a9d-9f02-4e00-b4e4-25ab295d6ffb" (UID: "16759a9d-9f02-4e00-b4e4-25ab295d6ffb"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.062742 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.063079 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerStarted","Data":"5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85"} Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.063572 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.063999 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdz4n\" (UniqueName: \"kubernetes.io/projected/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-kube-api-access-gdz4n\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.064121 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.064134 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.064143 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.064151 4728 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16759a9d-9f02-4e00-b4e4-25ab295d6ffb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:53 crc kubenswrapper[4728]: W0125 05:54:53.067073 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6ab997e_b648_4c08_9ba9_4166b43ebde2.slice/crio-20cbc57bf405ea079c2d2ba69b896c6a0b5d572bb851a96d8aa9cc69f2e2826a WatchSource:0}: Error finding container 20cbc57bf405ea079c2d2ba69b896c6a0b5d572bb851a96d8aa9cc69f2e2826a: Status 404 returned error can't find the container with id 20cbc57bf405ea079c2d2ba69b896c6a0b5d572bb851a96d8aa9cc69f2e2826a Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.068636 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55586495d8-v5qdm_16759a9d-9f02-4e00-b4e4-25ab295d6ffb/neutron-api/0.log" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.068695 4728 generic.go:334] "Generic (PLEG): container finished" podID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerID="b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358" exitCode=137 Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.068737 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55586495d8-v5qdm" event={"ID":"16759a9d-9f02-4e00-b4e4-25ab295d6ffb","Type":"ContainerDied","Data":"b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358"} Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.068776 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55586495d8-v5qdm" event={"ID":"16759a9d-9f02-4e00-b4e4-25ab295d6ffb","Type":"ContainerDied","Data":"5b3c14eb8b48ec8305fa632d99b594a356be56d374cf2814600a10f57f5525e1"} Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.068798 4728 scope.go:117] "RemoveContainer" containerID="0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.068932 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55586495d8-v5qdm" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.088725 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.709109522 podStartE2EDuration="6.088705828s" podCreationTimestamp="2026-01-25 05:54:47 +0000 UTC" firstStartedPulling="2026-01-25 05:54:48.804987851 +0000 UTC m=+979.840865831" lastFinishedPulling="2026-01-25 05:54:52.184584157 +0000 UTC m=+983.220462137" observedRunningTime="2026-01-25 05:54:53.08010609 +0000 UTC m=+984.115984070" watchObservedRunningTime="2026-01-25 05:54:53.088705828 +0000 UTC m=+984.124583808" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.105979 4728 scope.go:117] "RemoveContainer" containerID="b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.121989 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55586495d8-v5qdm"] Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.124392 4728 scope.go:117] "RemoveContainer" 
containerID="0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5" Jan 25 05:54:53 crc kubenswrapper[4728]: E0125 05:54:53.125609 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5\": container with ID starting with 0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5 not found: ID does not exist" containerID="0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.125712 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5"} err="failed to get container status \"0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5\": rpc error: code = NotFound desc = could not find container \"0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5\": container with ID starting with 0dedf770e1909218aae978a2c069e7aa95754e00a5349061303c973931a3f1e5 not found: ID does not exist" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.125797 4728 scope.go:117] "RemoveContainer" containerID="b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358" Jan 25 05:54:53 crc kubenswrapper[4728]: E0125 05:54:53.126143 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358\": container with ID starting with b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358 not found: ID does not exist" containerID="b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.126180 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358"} err="failed to get container status \"b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358\": rpc error: code = NotFound desc = could not find container \"b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358\": container with ID starting with b80d8552ce6646df30a8f7ff324f811384bbd53940594b67a07d607101217358 not found: ID does not exist" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.127796 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55586495d8-v5qdm"] Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.342565 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" path="/var/lib/kubelet/pods/16759a9d-9f02-4e00-b4e4-25ab295d6ffb/volumes" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.343168 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22486ac-1aae-4d93-9c43-8262f94761ff" path="/var/lib/kubelet/pods/e22486ac-1aae-4d93-9c43-8262f94761ff/volumes" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.863007 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:53 crc kubenswrapper[4728]: E0125 05:54:53.863708 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-api" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.863729 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-api" Jan 25 05:54:53 crc kubenswrapper[4728]: E0125 05:54:53.863750 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-httpd" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.863760 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-httpd" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.873771 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-httpd" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.873801 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="16759a9d-9f02-4e00-b4e4-25ab295d6ffb" containerName="neutron-api" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.874557 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.879895 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.880187 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.886541 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.888024 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-96mhm" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.982246 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.982348 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-openstack-config-secret\") 
pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.982396 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/693d3351-135b-490a-b613-60c730ac308f-openstack-config\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:53 crc kubenswrapper[4728]: I0125 05:54:53.982436 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrj64\" (UniqueName: \"kubernetes.io/projected/693d3351-135b-490a-b613-60c730ac308f-kube-api-access-mrj64\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.058159 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.086500 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/693d3351-135b-490a-b613-60c730ac308f-openstack-config\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.086659 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrj64\" (UniqueName: \"kubernetes.io/projected/693d3351-135b-490a-b613-60c730ac308f-kube-api-access-mrj64\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.086770 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.086852 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-openstack-config-secret\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.087394 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/693d3351-135b-490a-b613-60c730ac308f-openstack-config\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.104416 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.107391 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrj64\" (UniqueName: \"kubernetes.io/projected/693d3351-135b-490a-b613-60c730ac308f-kube-api-access-mrj64\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.108595 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6ab997e-b648-4c08-9ba9-4166b43ebde2","Type":"ContainerStarted","Data":"617aa3f617bf73f53666250ee971df1581fcece9d467485a129aae10f0eaea6e"} Jan 25 05:54:54 crc kubenswrapper[4728]: 
I0125 05:54:54.108621 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6ab997e-b648-4c08-9ba9-4166b43ebde2","Type":"ContainerStarted","Data":"20cbc57bf405ea079c2d2ba69b896c6a0b5d572bb851a96d8aa9cc69f2e2826a"} Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.109650 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-openstack-config-secret\") pod \"openstackclient\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.194829 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.201760 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.223087 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.235942 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.241471 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.246273 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:54 crc kubenswrapper[4728]: E0125 05:54:54.310613 4728 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 25 05:54:54 crc kubenswrapper[4728]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_693d3351-135b-490a-b613-60c730ac308f_0(8a1ca8d41559dd5ce8e66006c1d6b3b4251b2c79a05e180453cc37df880026ae): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8a1ca8d41559dd5ce8e66006c1d6b3b4251b2c79a05e180453cc37df880026ae" Netns:"/var/run/netns/a26bafa4-56c7-4738-9dc4-b82a4d0fd3cf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=8a1ca8d41559dd5ce8e66006c1d6b3b4251b2c79a05e180453cc37df880026ae;K8S_POD_UID=693d3351-135b-490a-b613-60c730ac308f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/693d3351-135b-490a-b613-60c730ac308f]: expected pod UID "693d3351-135b-490a-b613-60c730ac308f" but got "190d5aab-6cb5-4373-8e88-74ff4f94ca0e" from Kube API Jan 25 05:54:54 crc kubenswrapper[4728]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 25 05:54:54 crc kubenswrapper[4728]: > Jan 25 05:54:54 crc kubenswrapper[4728]: E0125 05:54:54.310706 4728 kuberuntime_sandbox.go:72] "Failed to 
create sandbox for pod" err=< Jan 25 05:54:54 crc kubenswrapper[4728]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_693d3351-135b-490a-b613-60c730ac308f_0(8a1ca8d41559dd5ce8e66006c1d6b3b4251b2c79a05e180453cc37df880026ae): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8a1ca8d41559dd5ce8e66006c1d6b3b4251b2c79a05e180453cc37df880026ae" Netns:"/var/run/netns/a26bafa4-56c7-4738-9dc4-b82a4d0fd3cf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=8a1ca8d41559dd5ce8e66006c1d6b3b4251b2c79a05e180453cc37df880026ae;K8S_POD_UID=693d3351-135b-490a-b613-60c730ac308f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/693d3351-135b-490a-b613-60c730ac308f]: expected pod UID "693d3351-135b-490a-b613-60c730ac308f" but got "190d5aab-6cb5-4373-8e88-74ff4f94ca0e" from Kube API Jan 25 05:54:54 crc kubenswrapper[4728]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 25 05:54:54 crc kubenswrapper[4728]: > pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.397045 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-openstack-config\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " 
pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.397089 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.397152 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-openstack-config-secret\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.397286 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksptm\" (UniqueName: \"kubernetes.io/projected/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-kube-api-access-ksptm\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.499184 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksptm\" (UniqueName: \"kubernetes.io/projected/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-kube-api-access-ksptm\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.499873 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-openstack-config\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: 
I0125 05:54:54.499955 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.500037 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-openstack-config-secret\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.500731 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-openstack-config\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.504763 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-openstack-config-secret\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.507278 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.516765 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksptm\" (UniqueName: 
\"kubernetes.io/projected/190d5aab-6cb5-4373-8e88-74ff4f94ca0e-kube-api-access-ksptm\") pod \"openstackclient\" (UID: \"190d5aab-6cb5-4373-8e88-74ff4f94ca0e\") " pod="openstack/openstackclient" Jan 25 05:54:54 crc kubenswrapper[4728]: I0125 05:54:54.648696 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.086466 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 25 05:54:55 crc kubenswrapper[4728]: W0125 05:54:55.090831 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod190d5aab_6cb5_4373_8e88_74ff4f94ca0e.slice/crio-65d5257c5c7b8280532efc4bbabdfec0c57cecd885b4d5684f34c37266491d90 WatchSource:0}: Error finding container 65d5257c5c7b8280532efc4bbabdfec0c57cecd885b4d5684f34c37266491d90: Status 404 returned error can't find the container with id 65d5257c5c7b8280532efc4bbabdfec0c57cecd885b4d5684f34c37266491d90 Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.117101 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6ab997e-b648-4c08-9ba9-4166b43ebde2","Type":"ContainerStarted","Data":"73fe1546b12c0f7ba509193bf10ade502dccd42868be3ad995b8a2b91ab5566e"} Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.117972 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"190d5aab-6cb5-4373-8e88-74ff4f94ca0e","Type":"ContainerStarted","Data":"65d5257c5c7b8280532efc4bbabdfec0c57cecd885b4d5684f34c37266491d90"} Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.117998 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.126919 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.134229 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.134220997 podStartE2EDuration="3.134220997s" podCreationTimestamp="2026-01-25 05:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:54:55.130010486 +0000 UTC m=+986.165888466" watchObservedRunningTime="2026-01-25 05:54:55.134220997 +0000 UTC m=+986.170098967" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.136855 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="693d3351-135b-490a-b613-60c730ac308f" podUID="190d5aab-6cb5-4373-8e88-74ff4f94ca0e" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.215064 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/693d3351-135b-490a-b613-60c730ac308f-openstack-config\") pod \"693d3351-135b-490a-b613-60c730ac308f\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.215144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrj64\" (UniqueName: \"kubernetes.io/projected/693d3351-135b-490a-b613-60c730ac308f-kube-api-access-mrj64\") pod \"693d3351-135b-490a-b613-60c730ac308f\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.215228 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-combined-ca-bundle\") pod \"693d3351-135b-490a-b613-60c730ac308f\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " Jan 25 05:54:55 crc 
kubenswrapper[4728]: I0125 05:54:55.215383 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-openstack-config-secret\") pod \"693d3351-135b-490a-b613-60c730ac308f\" (UID: \"693d3351-135b-490a-b613-60c730ac308f\") " Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.215695 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693d3351-135b-490a-b613-60c730ac308f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "693d3351-135b-490a-b613-60c730ac308f" (UID: "693d3351-135b-490a-b613-60c730ac308f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.216308 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/693d3351-135b-490a-b613-60c730ac308f-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.224307 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "693d3351-135b-490a-b613-60c730ac308f" (UID: "693d3351-135b-490a-b613-60c730ac308f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.226172 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693d3351-135b-490a-b613-60c730ac308f-kube-api-access-mrj64" (OuterVolumeSpecName: "kube-api-access-mrj64") pod "693d3351-135b-490a-b613-60c730ac308f" (UID: "693d3351-135b-490a-b613-60c730ac308f"). InnerVolumeSpecName "kube-api-access-mrj64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.226402 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "693d3351-135b-490a-b613-60c730ac308f" (UID: "693d3351-135b-490a-b613-60c730ac308f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.317796 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrj64\" (UniqueName: \"kubernetes.io/projected/693d3351-135b-490a-b613-60c730ac308f-kube-api-access-mrj64\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.317828 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.317837 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/693d3351-135b-490a-b613-60c730ac308f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:55 crc kubenswrapper[4728]: I0125 05:54:55.336940 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693d3351-135b-490a-b613-60c730ac308f" path="/var/lib/kubelet/pods/693d3351-135b-490a-b613-60c730ac308f/volumes" Jan 25 05:54:56 crc kubenswrapper[4728]: I0125 05:54:56.130772 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 25 05:54:56 crc kubenswrapper[4728]: I0125 05:54:56.141971 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="693d3351-135b-490a-b613-60c730ac308f" podUID="190d5aab-6cb5-4373-8e88-74ff4f94ca0e" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.148188 4728 generic.go:334] "Generic (PLEG): container finished" podID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerID="45b6c38a6adb6ecce8de335ca3fff0c4f011cf28e4e5b7fab52ca0f28844ae5b" exitCode=0 Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.148504 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9455cf8b5-xnn9p" event={"ID":"05936d12-72c1-4916-a564-2f4a886c2a0d","Type":"ContainerDied","Data":"45b6c38a6adb6ecce8de335ca3fff0c4f011cf28e4e5b7fab52ca0f28844ae5b"} Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.299731 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.358847 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-ovndb-tls-certs\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.358912 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-httpd-config\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.358986 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-public-tls-certs\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.359041 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-config\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.359200 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-internal-tls-certs\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.359276 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdwbx\" (UniqueName: 
\"kubernetes.io/projected/05936d12-72c1-4916-a564-2f4a886c2a0d-kube-api-access-gdwbx\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.359333 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-combined-ca-bundle\") pod \"05936d12-72c1-4916-a564-2f4a886c2a0d\" (UID: \"05936d12-72c1-4916-a564-2f4a886c2a0d\") " Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.367946 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05936d12-72c1-4916-a564-2f4a886c2a0d-kube-api-access-gdwbx" (OuterVolumeSpecName: "kube-api-access-gdwbx") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "kube-api-access-gdwbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.369115 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.406094 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.418290 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.424561 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-config" (OuterVolumeSpecName: "config") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.425401 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.433499 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "05936d12-72c1-4916-a564-2f4a886c2a0d" (UID: "05936d12-72c1-4916-a564-2f4a886c2a0d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.434827 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462209 4728 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462374 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462474 4728 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462532 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdwbx\" (UniqueName: \"kubernetes.io/projected/05936d12-72c1-4916-a564-2f4a886c2a0d-kube-api-access-gdwbx\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462628 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462702 4728 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:57 crc kubenswrapper[4728]: I0125 05:54:57.462768 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/05936d12-72c1-4916-a564-2f4a886c2a0d-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.177546 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9455cf8b5-xnn9p" event={"ID":"05936d12-72c1-4916-a564-2f4a886c2a0d","Type":"ContainerDied","Data":"4a228917f27cd5686220fc79ab81b314d5ae93e65112b53893ee747ae6b66c60"} Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.177612 4728 scope.go:117] "RemoveContainer" containerID="0073b56744b284a02715ef6b0aa4ddac0bbf6877826227bf5355e772f97ecb13" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.177716 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9455cf8b5-xnn9p" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.210337 4728 scope.go:117] "RemoveContainer" containerID="45b6c38a6adb6ecce8de335ca3fff0c4f011cf28e4e5b7fab52ca0f28844ae5b" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.211772 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9455cf8b5-xnn9p"] Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.218685 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9455cf8b5-xnn9p"] Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.949711 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5b9459545f-6l97s"] Jan 25 05:54:58 crc kubenswrapper[4728]: E0125 05:54:58.950430 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-httpd" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.950450 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-httpd" Jan 25 05:54:58 crc kubenswrapper[4728]: E0125 05:54:58.950460 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-api" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.950466 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-api" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.950665 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-httpd" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.950691 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" containerName="neutron-api" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.951664 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.954787 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.954987 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.955140 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 25 05:54:58 crc kubenswrapper[4728]: I0125 05:54:58.963702 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b9459545f-6l97s"] Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.091383 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a98851c-d86d-423f-a11c-a36fc78633a8-log-httpd\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.091518 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-public-tls-certs\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.091599 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a98851c-d86d-423f-a11c-a36fc78633a8-run-httpd\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.091848 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-internal-tls-certs\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.091967 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2a98851c-d86d-423f-a11c-a36fc78633a8-etc-swift\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.092179 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-config-data\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 
05:54:59.092237 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-combined-ca-bundle\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.092505 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnkql\" (UniqueName: \"kubernetes.io/projected/2a98851c-d86d-423f-a11c-a36fc78633a8-kube-api-access-hnkql\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.194966 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2a98851c-d86d-423f-a11c-a36fc78633a8-etc-swift\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-config-data\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-combined-ca-bundle\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 
05:54:59.195122 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnkql\" (UniqueName: \"kubernetes.io/projected/2a98851c-d86d-423f-a11c-a36fc78633a8-kube-api-access-hnkql\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195206 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a98851c-d86d-423f-a11c-a36fc78633a8-log-httpd\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195267 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-public-tls-certs\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195295 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a98851c-d86d-423f-a11c-a36fc78633a8-run-httpd\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195337 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-internal-tls-certs\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195861 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a98851c-d86d-423f-a11c-a36fc78633a8-log-httpd\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.195915 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a98851c-d86d-423f-a11c-a36fc78633a8-run-httpd\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.200926 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-combined-ca-bundle\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.203852 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-internal-tls-certs\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.205629 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-public-tls-certs\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.205977 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a98851c-d86d-423f-a11c-a36fc78633a8-config-data\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.209470 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2a98851c-d86d-423f-a11c-a36fc78633a8-etc-swift\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.211013 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnkql\" (UniqueName: \"kubernetes.io/projected/2a98851c-d86d-423f-a11c-a36fc78633a8-kube-api-access-hnkql\") pod \"swift-proxy-5b9459545f-6l97s\" (UID: \"2a98851c-d86d-423f-a11c-a36fc78633a8\") " pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.266882 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.341973 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05936d12-72c1-4916-a564-2f4a886c2a0d" path="/var/lib/kubelet/pods/05936d12-72c1-4916-a564-2f4a886c2a0d/volumes" Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.588304 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.588551 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-central-agent" containerID="cri-o://d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65" gracePeriod=30 Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.588694 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="proxy-httpd" containerID="cri-o://5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85" gracePeriod=30 Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.588753 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="sg-core" containerID="cri-o://1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41" gracePeriod=30 Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.588766 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-notification-agent" containerID="cri-o://2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9" gracePeriod=30 Jan 25 05:54:59 crc kubenswrapper[4728]: I0125 05:54:59.777784 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-proxy-5b9459545f-6l97s"] Jan 25 05:54:59 crc kubenswrapper[4728]: W0125 05:54:59.780477 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a98851c_d86d_423f_a11c_a36fc78633a8.slice/crio-1952deaac9bf31760e1f4c7401c8aa4c4f6e9eeb386efb5ec3730023acfc59fd WatchSource:0}: Error finding container 1952deaac9bf31760e1f4c7401c8aa4c4f6e9eeb386efb5ec3730023acfc59fd: Status 404 returned error can't find the container with id 1952deaac9bf31760e1f4c7401c8aa4c4f6e9eeb386efb5ec3730023acfc59fd Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.202855 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b9459545f-6l97s" event={"ID":"2a98851c-d86d-423f-a11c-a36fc78633a8","Type":"ContainerStarted","Data":"42f248fadbb5b2f9aa1c4bddb10e19438c49a8cc7ab5faa82b6548b3bd326450"} Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.202925 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b9459545f-6l97s" event={"ID":"2a98851c-d86d-423f-a11c-a36fc78633a8","Type":"ContainerStarted","Data":"0c9de07e2e7b00709ceace466092b305ff2c930b3e2b2e7a1a8626fbbdad9942"} Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.202945 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b9459545f-6l97s" event={"ID":"2a98851c-d86d-423f-a11c-a36fc78633a8","Type":"ContainerStarted","Data":"1952deaac9bf31760e1f4c7401c8aa4c4f6e9eeb386efb5ec3730023acfc59fd"} Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.203019 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.210973 4728 generic.go:334] "Generic (PLEG): container finished" podID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerID="5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85" exitCode=0 Jan 25 05:55:00 crc 
kubenswrapper[4728]: I0125 05:55:00.211022 4728 generic.go:334] "Generic (PLEG): container finished" podID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerID="1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41" exitCode=2 Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.211035 4728 generic.go:334] "Generic (PLEG): container finished" podID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerID="d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65" exitCode=0 Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.211038 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerDied","Data":"5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85"} Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.211082 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerDied","Data":"1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41"} Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.211095 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerDied","Data":"d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65"} Jan 25 05:55:00 crc kubenswrapper[4728]: I0125 05:55:00.232835 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b9459545f-6l97s" podStartSLOduration=2.232793614 podStartE2EDuration="2.232793614s" podCreationTimestamp="2026-01-25 05:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:00.22359093 +0000 UTC m=+991.259468910" watchObservedRunningTime="2026-01-25 05:55:00.232793614 +0000 UTC m=+991.268671594" Jan 25 05:55:01 crc kubenswrapper[4728]: I0125 
05:55:01.223144 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:55:01 crc kubenswrapper[4728]: I0125 05:55:01.989165 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.076383 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-scripts\") pod \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.076768 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-config-data\") pod \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.076847 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-sg-core-conf-yaml\") pod \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.077002 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckxrf\" (UniqueName: \"kubernetes.io/projected/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-kube-api-access-ckxrf\") pod \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.090077 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-run-httpd\") pod 
\"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.090148 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-combined-ca-bundle\") pod \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.090223 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-log-httpd\") pod \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\" (UID: \"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b\") " Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.093202 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.095464 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-kube-api-access-ckxrf" (OuterVolumeSpecName: "kube-api-access-ckxrf") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). InnerVolumeSpecName "kube-api-access-ckxrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.098444 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-scripts" (OuterVolumeSpecName: "scripts") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.098885 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.109424 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.163249 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-config-data" (OuterVolumeSpecName: "config-data") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.192431 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.192450 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.192463 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.192475 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckxrf\" (UniqueName: \"kubernetes.io/projected/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-kube-api-access-ckxrf\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.192501 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.192510 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.199358 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" (UID: "ff58bc3b-22fc-47fc-a364-4a6bc693fb6b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.240085 4728 generic.go:334] "Generic (PLEG): container finished" podID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerID="2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9" exitCode=0 Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.240818 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.242588 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerDied","Data":"2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9"} Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.242640 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff58bc3b-22fc-47fc-a364-4a6bc693fb6b","Type":"ContainerDied","Data":"c0eac17da593a27684c639b98a19936ca733dcfeef871b35f89e53300f744e8a"} Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.242659 4728 scope.go:117] "RemoveContainer" containerID="5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.292001 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.295007 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.305748 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.316643 4728 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Jan 25 05:55:02 crc kubenswrapper[4728]: E0125 05:55:02.317058 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="proxy-httpd" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317076 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="proxy-httpd" Jan 25 05:55:02 crc kubenswrapper[4728]: E0125 05:55:02.317095 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-notification-agent" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317102 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-notification-agent" Jan 25 05:55:02 crc kubenswrapper[4728]: E0125 05:55:02.317110 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="sg-core" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317117 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="sg-core" Jan 25 05:55:02 crc kubenswrapper[4728]: E0125 05:55:02.317140 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-central-agent" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317148 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-central-agent" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317331 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-central-agent" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317357 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="ceilometer-notification-agent" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317366 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="sg-core" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.317378 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" containerName="proxy-httpd" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.319033 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.321780 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.322977 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.325069 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.397728 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-config-data\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.397885 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-log-httpd\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.398064 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-run-httpd\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.398188 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.398391 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-scripts\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.398423 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.398759 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gggb\" (UniqueName: \"kubernetes.io/projected/6db7e175-0ee7-4017-ac12-dc357676e8e1-kube-api-access-9gggb\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501534 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gggb\" (UniqueName: 
\"kubernetes.io/projected/6db7e175-0ee7-4017-ac12-dc357676e8e1-kube-api-access-9gggb\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501623 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-config-data\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501683 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-log-httpd\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501746 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-run-httpd\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501792 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501826 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-scripts\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.501847 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.502490 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-run-httpd\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.502788 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-log-httpd\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.508248 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-config-data\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.508811 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.509901 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " 
pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.510517 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-scripts\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.520362 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gggb\" (UniqueName: \"kubernetes.io/projected/6db7e175-0ee7-4017-ac12-dc357676e8e1-kube-api-access-9gggb\") pod \"ceilometer-0\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.641901 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:02 crc kubenswrapper[4728]: I0125 05:55:02.691578 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 25 05:55:03 crc kubenswrapper[4728]: I0125 05:55:03.339602 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff58bc3b-22fc-47fc-a364-4a6bc693fb6b" path="/var/lib/kubelet/pods/ff58bc3b-22fc-47fc-a364-4a6bc693fb6b/volumes" Jan 25 05:55:04 crc kubenswrapper[4728]: I0125 05:55:04.279498 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.141456 4728 scope.go:117] "RemoveContainer" containerID="1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.190106 4728 scope.go:117] "RemoveContainer" containerID="2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.323292 4728 scope.go:117] "RemoveContainer" 
containerID="d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.381694 4728 scope.go:117] "RemoveContainer" containerID="5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85" Jan 25 05:55:07 crc kubenswrapper[4728]: E0125 05:55:07.382712 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85\": container with ID starting with 5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85 not found: ID does not exist" containerID="5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.382753 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85"} err="failed to get container status \"5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85\": rpc error: code = NotFound desc = could not find container \"5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85\": container with ID starting with 5646075c56664e93a4187015db74dd0b12b77ac90da82f703fdc3b4850cfac85 not found: ID does not exist" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.382780 4728 scope.go:117] "RemoveContainer" containerID="1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41" Jan 25 05:55:07 crc kubenswrapper[4728]: E0125 05:55:07.383002 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41\": container with ID starting with 1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41 not found: ID does not exist" containerID="1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41" Jan 25 05:55:07 crc 
kubenswrapper[4728]: I0125 05:55:07.383018 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41"} err="failed to get container status \"1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41\": rpc error: code = NotFound desc = could not find container \"1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41\": container with ID starting with 1e5318a0c424a8848a8cfc64a417b4be81ac20c9830f1a0dd65b408abb59fc41 not found: ID does not exist" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.383031 4728 scope.go:117] "RemoveContainer" containerID="2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9" Jan 25 05:55:07 crc kubenswrapper[4728]: E0125 05:55:07.383270 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9\": container with ID starting with 2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9 not found: ID does not exist" containerID="2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.383290 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9"} err="failed to get container status \"2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9\": rpc error: code = NotFound desc = could not find container \"2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9\": container with ID starting with 2a7e5d84f0fa17b07ef494efda9734380faf6f38dd9d986f458aca9192c116b9 not found: ID does not exist" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.383303 4728 scope.go:117] "RemoveContainer" containerID="d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65" Jan 25 
05:55:07 crc kubenswrapper[4728]: E0125 05:55:07.383509 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65\": container with ID starting with d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65 not found: ID does not exist" containerID="d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.383524 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65"} err="failed to get container status \"d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65\": rpc error: code = NotFound desc = could not find container \"d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65\": container with ID starting with d88ea9830c3cbe3f83b4e5a76c0c8b46ac04e15474b0a7468cc39310db57cb65 not found: ID does not exist" Jan 25 05:55:07 crc kubenswrapper[4728]: I0125 05:55:07.620869 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:08 crc kubenswrapper[4728]: I0125 05:55:08.312823 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerStarted","Data":"058033f98ce4c02d7e8130ce1cf6aba0e844b5bd9a1dec4118e7e208a7cb7c03"} Jan 25 05:55:08 crc kubenswrapper[4728]: I0125 05:55:08.315990 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"190d5aab-6cb5-4373-8e88-74ff4f94ca0e","Type":"ContainerStarted","Data":"7207762c9d15fe5bbfae96087f056489e9d224b437362a39176350c2351e0320"} Jan 25 05:55:08 crc kubenswrapper[4728]: I0125 05:55:08.334389 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.226835978 
podStartE2EDuration="14.33437518s" podCreationTimestamp="2026-01-25 05:54:54 +0000 UTC" firstStartedPulling="2026-01-25 05:54:55.09284023 +0000 UTC m=+986.128718211" lastFinishedPulling="2026-01-25 05:55:07.200379433 +0000 UTC m=+998.236257413" observedRunningTime="2026-01-25 05:55:08.330046077 +0000 UTC m=+999.365924057" watchObservedRunningTime="2026-01-25 05:55:08.33437518 +0000 UTC m=+999.370253160" Jan 25 05:55:09 crc kubenswrapper[4728]: I0125 05:55:09.273049 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b9459545f-6l97s" Jan 25 05:55:09 crc kubenswrapper[4728]: I0125 05:55:09.346387 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerStarted","Data":"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54"} Jan 25 05:55:09 crc kubenswrapper[4728]: I0125 05:55:09.346609 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerStarted","Data":"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae"} Jan 25 05:55:09 crc kubenswrapper[4728]: I0125 05:55:09.564368 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:10 crc kubenswrapper[4728]: I0125 05:55:10.354531 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerStarted","Data":"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc"} Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.392828 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hz4pk"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.399912 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.403058 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hz4pk"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.482199 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9gc2d"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.483687 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.495201 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bbea-account-create-update-t7pck"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.499284 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.501174 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.519789 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bbea-account-create-update-t7pck"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.530858 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9gc2d"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.551818 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5739844d-c36a-4b25-b5e1-6049a9be36a5-operator-scripts\") pod \"nova-api-db-create-hz4pk\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.551932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-257xq\" (UniqueName: \"kubernetes.io/projected/5739844d-c36a-4b25-b5e1-6049a9be36a5-kube-api-access-257xq\") pod \"nova-api-db-create-hz4pk\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.596433 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-cxrhx"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.608479 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.612906 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cxrhx"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.654844 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zjm\" (UniqueName: \"kubernetes.io/projected/5a338507-ec0d-4723-9eff-8242791ac1e4-kube-api-access-t7zjm\") pod \"nova-api-bbea-account-create-update-t7pck\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.654951 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5739844d-c36a-4b25-b5e1-6049a9be36a5-operator-scripts\") pod \"nova-api-db-create-hz4pk\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655021 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17bc2814-7f95-490f-8ac3-2596dad8acc7-operator-scripts\") pod \"nova-cell0-db-create-9gc2d\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " 
pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655104 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-operator-scripts\") pod \"nova-cell1-db-create-cxrhx\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655140 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257xq\" (UniqueName: \"kubernetes.io/projected/5739844d-c36a-4b25-b5e1-6049a9be36a5-kube-api-access-257xq\") pod \"nova-api-db-create-hz4pk\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655165 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r478\" (UniqueName: \"kubernetes.io/projected/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-kube-api-access-7r478\") pod \"nova-cell1-db-create-cxrhx\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655186 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krcmm\" (UniqueName: \"kubernetes.io/projected/17bc2814-7f95-490f-8ac3-2596dad8acc7-kube-api-access-krcmm\") pod \"nova-cell0-db-create-9gc2d\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655206 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a338507-ec0d-4723-9eff-8242791ac1e4-operator-scripts\") pod 
\"nova-api-bbea-account-create-update-t7pck\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.655874 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5739844d-c36a-4b25-b5e1-6049a9be36a5-operator-scripts\") pod \"nova-api-db-create-hz4pk\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.678634 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257xq\" (UniqueName: \"kubernetes.io/projected/5739844d-c36a-4b25-b5e1-6049a9be36a5-kube-api-access-257xq\") pod \"nova-api-db-create-hz4pk\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.683071 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e2fd-account-create-update-5n5s4"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.684528 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.690578 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.709136 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e2fd-account-create-update-5n5s4"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.714007 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.757405 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65872da3-e443-40db-8a63-96aea1382b3f-operator-scripts\") pod \"nova-cell0-e2fd-account-create-update-5n5s4\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.757729 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zjm\" (UniqueName: \"kubernetes.io/projected/5a338507-ec0d-4723-9eff-8242791ac1e4-kube-api-access-t7zjm\") pod \"nova-api-bbea-account-create-update-t7pck\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.757879 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17bc2814-7f95-490f-8ac3-2596dad8acc7-operator-scripts\") pod \"nova-cell0-db-create-9gc2d\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.758000 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-operator-scripts\") pod \"nova-cell1-db-create-cxrhx\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.758205 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r478\" (UniqueName: \"kubernetes.io/projected/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-kube-api-access-7r478\") pod 
\"nova-cell1-db-create-cxrhx\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.758533 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krcmm\" (UniqueName: \"kubernetes.io/projected/17bc2814-7f95-490f-8ac3-2596dad8acc7-kube-api-access-krcmm\") pod \"nova-cell0-db-create-9gc2d\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.758691 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a338507-ec0d-4723-9eff-8242791ac1e4-operator-scripts\") pod \"nova-api-bbea-account-create-update-t7pck\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.758807 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j74k\" (UniqueName: \"kubernetes.io/projected/65872da3-e443-40db-8a63-96aea1382b3f-kube-api-access-6j74k\") pod \"nova-cell0-e2fd-account-create-update-5n5s4\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.759539 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-operator-scripts\") pod \"nova-cell1-db-create-cxrhx\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.759767 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5a338507-ec0d-4723-9eff-8242791ac1e4-operator-scripts\") pod \"nova-api-bbea-account-create-update-t7pck\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.759908 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17bc2814-7f95-490f-8ac3-2596dad8acc7-operator-scripts\") pod \"nova-cell0-db-create-9gc2d\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.772950 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zjm\" (UniqueName: \"kubernetes.io/projected/5a338507-ec0d-4723-9eff-8242791ac1e4-kube-api-access-t7zjm\") pod \"nova-api-bbea-account-create-update-t7pck\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.776861 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krcmm\" (UniqueName: \"kubernetes.io/projected/17bc2814-7f95-490f-8ac3-2596dad8acc7-kube-api-access-krcmm\") pod \"nova-cell0-db-create-9gc2d\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.778517 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r478\" (UniqueName: \"kubernetes.io/projected/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-kube-api-access-7r478\") pod \"nova-cell1-db-create-cxrhx\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.795304 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.814081 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.863448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j74k\" (UniqueName: \"kubernetes.io/projected/65872da3-e443-40db-8a63-96aea1382b3f-kube-api-access-6j74k\") pod \"nova-cell0-e2fd-account-create-update-5n5s4\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.863720 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65872da3-e443-40db-8a63-96aea1382b3f-operator-scripts\") pod \"nova-cell0-e2fd-account-create-update-5n5s4\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.864562 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65872da3-e443-40db-8a63-96aea1382b3f-operator-scripts\") pod \"nova-cell0-e2fd-account-create-update-5n5s4\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.923986 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4eec-account-create-update-m2z8d"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.925956 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.926522 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.931106 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.939417 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j74k\" (UniqueName: \"kubernetes.io/projected/65872da3-e443-40db-8a63-96aea1382b3f-kube-api-access-6j74k\") pod \"nova-cell0-e2fd-account-create-update-5n5s4\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.955657 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4eec-account-create-update-m2z8d"] Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.973372 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxjl\" (UniqueName: \"kubernetes.io/projected/0154728c-361d-44a4-85ca-39167e69fc69-kube-api-access-jsxjl\") pod \"nova-cell1-4eec-account-create-update-m2z8d\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:11 crc kubenswrapper[4728]: I0125 05:55:11.973445 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0154728c-361d-44a4-85ca-39167e69fc69-operator-scripts\") pod \"nova-cell1-4eec-account-create-update-m2z8d\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.027620 4728 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.077398 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxjl\" (UniqueName: \"kubernetes.io/projected/0154728c-361d-44a4-85ca-39167e69fc69-kube-api-access-jsxjl\") pod \"nova-cell1-4eec-account-create-update-m2z8d\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.077502 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0154728c-361d-44a4-85ca-39167e69fc69-operator-scripts\") pod \"nova-cell1-4eec-account-create-update-m2z8d\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.078525 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0154728c-361d-44a4-85ca-39167e69fc69-operator-scripts\") pod \"nova-cell1-4eec-account-create-update-m2z8d\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.099286 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxjl\" (UniqueName: \"kubernetes.io/projected/0154728c-361d-44a4-85ca-39167e69fc69-kube-api-access-jsxjl\") pod \"nova-cell1-4eec-account-create-update-m2z8d\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.195693 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9gc2d"] Jan 25 
05:55:12 crc kubenswrapper[4728]: W0125 05:55:12.207565 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17bc2814_7f95_490f_8ac3_2596dad8acc7.slice/crio-35910ae273212f0f260a5b38781615085b42af3a5d0f0d72f5632f7a8ee9a6f2 WatchSource:0}: Error finding container 35910ae273212f0f260a5b38781615085b42af3a5d0f0d72f5632f7a8ee9a6f2: Status 404 returned error can't find the container with id 35910ae273212f0f260a5b38781615085b42af3a5d0f0d72f5632f7a8ee9a6f2 Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.214883 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hz4pk"] Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.305261 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.385399 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerStarted","Data":"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03"} Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.385624 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-central-agent" containerID="cri-o://33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" gracePeriod=30 Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.385788 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="proxy-httpd" containerID="cri-o://61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" gracePeriod=30 Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.385834 4728 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="sg-core" containerID="cri-o://b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" gracePeriod=30 Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.385893 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-notification-agent" containerID="cri-o://d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" gracePeriod=30 Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.385908 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.392894 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9gc2d" event={"ID":"17bc2814-7f95-490f-8ac3-2596dad8acc7","Type":"ContainerStarted","Data":"35910ae273212f0f260a5b38781615085b42af3a5d0f0d72f5632f7a8ee9a6f2"} Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.395526 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hz4pk" event={"ID":"5739844d-c36a-4b25-b5e1-6049a9be36a5","Type":"ContainerStarted","Data":"d1b2aa0e663ce3cdd4a9d3c3bb1fbeb896c99a1d626d72ed2efa9f42d7ab49cf"} Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.421934 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.823317126 podStartE2EDuration="10.421911929s" podCreationTimestamp="2026-01-25 05:55:02 +0000 UTC" firstStartedPulling="2026-01-25 05:55:07.62592193 +0000 UTC m=+998.661799910" lastFinishedPulling="2026-01-25 05:55:11.224516743 +0000 UTC m=+1002.260394713" observedRunningTime="2026-01-25 05:55:12.407553171 +0000 UTC m=+1003.443431151" watchObservedRunningTime="2026-01-25 05:55:12.421911929 +0000 UTC m=+1003.457789908" Jan 25 
05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.549090 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bbea-account-create-update-t7pck"] Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.566192 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e2fd-account-create-update-5n5s4"] Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.639110 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cxrhx"] Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.826193 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4eec-account-create-update-m2z8d"] Jan 25 05:55:12 crc kubenswrapper[4728]: W0125 05:55:12.880159 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0154728c_361d_44a4_85ca_39167e69fc69.slice/crio-fbacaa9c4eac72f056eceae3fd8bf92fa99abd45e8242e3fd1e7e6d2d6bea77d WatchSource:0}: Error finding container fbacaa9c4eac72f056eceae3fd8bf92fa99abd45e8242e3fd1e7e6d2d6bea77d: Status 404 returned error can't find the container with id fbacaa9c4eac72f056eceae3fd8bf92fa99abd45e8242e3fd1e7e6d2d6bea77d Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.900029 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:55:12 crc kubenswrapper[4728]: I0125 05:55:12.900083 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:55:13 crc 
kubenswrapper[4728]: I0125 05:55:13.042297 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199113 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-scripts\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199240 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-log-httpd\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199295 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-run-httpd\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199386 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-sg-core-conf-yaml\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199433 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-config-data\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199529 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-combined-ca-bundle\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199580 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gggb\" (UniqueName: \"kubernetes.io/projected/6db7e175-0ee7-4017-ac12-dc357676e8e1-kube-api-access-9gggb\") pod \"6db7e175-0ee7-4017-ac12-dc357676e8e1\" (UID: \"6db7e175-0ee7-4017-ac12-dc357676e8e1\") " Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.199870 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.200178 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.200576 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.210831 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-scripts" (OuterVolumeSpecName: "scripts") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.223494 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db7e175-0ee7-4017-ac12-dc357676e8e1-kube-api-access-9gggb" (OuterVolumeSpecName: "kube-api-access-9gggb") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "kube-api-access-9gggb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.228820 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.258131 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.281908 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-config-data" (OuterVolumeSpecName: "config-data") pod "6db7e175-0ee7-4017-ac12-dc357676e8e1" (UID: "6db7e175-0ee7-4017-ac12-dc357676e8e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.302973 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.303010 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.303029 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gggb\" (UniqueName: \"kubernetes.io/projected/6db7e175-0ee7-4017-ac12-dc357676e8e1-kube-api-access-9gggb\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.303044 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.303053 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6db7e175-0ee7-4017-ac12-dc357676e8e1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.303063 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6db7e175-0ee7-4017-ac12-dc357676e8e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.406428 4728 generic.go:334] "Generic (PLEG): container finished" podID="5a338507-ec0d-4723-9eff-8242791ac1e4" containerID="1105474da8029154b0e2d922ce0acf9a4715799696e38890160e1a9bdcc93af9" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.406529 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bbea-account-create-update-t7pck" event={"ID":"5a338507-ec0d-4723-9eff-8242791ac1e4","Type":"ContainerDied","Data":"1105474da8029154b0e2d922ce0acf9a4715799696e38890160e1a9bdcc93af9"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.406566 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bbea-account-create-update-t7pck" event={"ID":"5a338507-ec0d-4723-9eff-8242791ac1e4","Type":"ContainerStarted","Data":"75973f1f5739768ef239ebdd43b7cd4e5031f3fc7b27de315137e95e9996b124"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.411061 4728 generic.go:334] "Generic (PLEG): container finished" podID="65872da3-e443-40db-8a63-96aea1382b3f" containerID="1249916ce16c9d7ed67ca807e9a4d754240ae61e4decc5e91ccd83963528b1ac" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.411124 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" event={"ID":"65872da3-e443-40db-8a63-96aea1382b3f","Type":"ContainerDied","Data":"1249916ce16c9d7ed67ca807e9a4d754240ae61e4decc5e91ccd83963528b1ac"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.411164 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" event={"ID":"65872da3-e443-40db-8a63-96aea1382b3f","Type":"ContainerStarted","Data":"f218185ec11088972c8f5c40016b24e7047bf4301266676a5bb9ccaa219b6dd9"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.414768 
4728 generic.go:334] "Generic (PLEG): container finished" podID="17bc2814-7f95-490f-8ac3-2596dad8acc7" containerID="6b6ea53435ef08771550d35acdc38583da5bb60b7bc6538c81902756543cfe99" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.414884 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9gc2d" event={"ID":"17bc2814-7f95-490f-8ac3-2596dad8acc7","Type":"ContainerDied","Data":"6b6ea53435ef08771550d35acdc38583da5bb60b7bc6538c81902756543cfe99"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.437941 4728 generic.go:334] "Generic (PLEG): container finished" podID="0154728c-361d-44a4-85ca-39167e69fc69" containerID="2ded8a485fbb83fb007a59ec779a1bd8cb40c14224ec013eca052af6927fdb08" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.438196 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" event={"ID":"0154728c-361d-44a4-85ca-39167e69fc69","Type":"ContainerDied","Data":"2ded8a485fbb83fb007a59ec779a1bd8cb40c14224ec013eca052af6927fdb08"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.438266 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" event={"ID":"0154728c-361d-44a4-85ca-39167e69fc69","Type":"ContainerStarted","Data":"fbacaa9c4eac72f056eceae3fd8bf92fa99abd45e8242e3fd1e7e6d2d6bea77d"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.447189 4728 generic.go:334] "Generic (PLEG): container finished" podID="5739844d-c36a-4b25-b5e1-6049a9be36a5" containerID="b1bd0c270fc8ad76a9b0d0904d0f0d261f0d39f58bfc0bc1be3adf8b2548d686" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.447367 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hz4pk" event={"ID":"5739844d-c36a-4b25-b5e1-6049a9be36a5","Type":"ContainerDied","Data":"b1bd0c270fc8ad76a9b0d0904d0f0d261f0d39f58bfc0bc1be3adf8b2548d686"} Jan 25 05:55:13 crc 
kubenswrapper[4728]: I0125 05:55:13.453084 4728 generic.go:334] "Generic (PLEG): container finished" podID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453117 4728 generic.go:334] "Generic (PLEG): container finished" podID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" exitCode=2 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453126 4728 generic.go:334] "Generic (PLEG): container finished" podID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453134 4728 generic.go:334] "Generic (PLEG): container finished" podID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" exitCode=0 Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453278 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453335 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerDied","Data":"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453379 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerDied","Data":"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453396 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerDied","Data":"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453416 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerDied","Data":"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453427 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6db7e175-0ee7-4017-ac12-dc357676e8e1","Type":"ContainerDied","Data":"058033f98ce4c02d7e8130ce1cf6aba0e844b5bd9a1dec4118e7e208a7cb7c03"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.453447 4728 scope.go:117] "RemoveContainer" containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.474372 4728 generic.go:334] "Generic (PLEG): container finished" podID="33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" containerID="500f560ec3f40f191a19e50ca98e8c3ffc055aa51f434859f9c351444f7f3edc" exitCode=0 Jan 25 05:55:13 crc 
kubenswrapper[4728]: I0125 05:55:13.474422 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cxrhx" event={"ID":"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb","Type":"ContainerDied","Data":"500f560ec3f40f191a19e50ca98e8c3ffc055aa51f434859f9c351444f7f3edc"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.474445 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cxrhx" event={"ID":"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb","Type":"ContainerStarted","Data":"f8165ca20d466dae2ea2356d43e072dad7dbabf90d7dcbbd36f81dea11d2b0d6"} Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.505215 4728 scope.go:117] "RemoveContainer" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.537404 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.540480 4728 scope.go:117] "RemoveContainer" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.556787 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.567205 4728 scope.go:117] "RemoveContainer" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.577057 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.578648 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="sg-core" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.578677 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="sg-core" Jan 25 05:55:13 crc 
kubenswrapper[4728]: E0125 05:55:13.578692 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-central-agent" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.578907 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-central-agent" Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.578947 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="proxy-httpd" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.578957 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="proxy-httpd" Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.578973 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-notification-agent" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.578979 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-notification-agent" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.581919 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="ceilometer-notification-agent" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.581944 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="sg-core" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.581968 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" containerName="proxy-httpd" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.581980 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" 
containerName="ceilometer-central-agent" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.583788 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.583889 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.585472 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.586045 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.588286 4728 scope.go:117] "RemoveContainer" containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.589237 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": container with ID starting with 61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03 not found: ID does not exist" containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.589272 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03"} err="failed to get container status \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": rpc error: code = NotFound desc = could not find container \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": container with ID starting with 61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.589295 4728 
scope.go:117] "RemoveContainer" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.589650 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": container with ID starting with b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc not found: ID does not exist" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.589697 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc"} err="failed to get container status \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": rpc error: code = NotFound desc = could not find container \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": container with ID starting with b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.589726 4728 scope.go:117] "RemoveContainer" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.590030 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": container with ID starting with d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54 not found: ID does not exist" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.590057 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54"} err="failed to get container status \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": rpc error: code = NotFound desc = could not find container \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": container with ID starting with d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.590073 4728 scope.go:117] "RemoveContainer" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" Jan 25 05:55:13 crc kubenswrapper[4728]: E0125 05:55:13.590314 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": container with ID starting with 33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae not found: ID does not exist" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.590352 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae"} err="failed to get container status \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": rpc error: code = NotFound desc = could not find container \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": container with ID starting with 33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.590365 4728 scope.go:117] "RemoveContainer" containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.590555 4728 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03"} err="failed to get container status \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": rpc error: code = NotFound desc = could not find container \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": container with ID starting with 61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.590572 4728 scope.go:117] "RemoveContainer" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.591990 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc"} err="failed to get container status \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": rpc error: code = NotFound desc = could not find container \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": container with ID starting with b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.592011 4728 scope.go:117] "RemoveContainer" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.592413 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54"} err="failed to get container status \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": rpc error: code = NotFound desc = could not find container \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": container with ID starting with d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54 not 
found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.592438 4728 scope.go:117] "RemoveContainer" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.592968 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae"} err="failed to get container status \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": rpc error: code = NotFound desc = could not find container \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": container with ID starting with 33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.593005 4728 scope.go:117] "RemoveContainer" containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.593401 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03"} err="failed to get container status \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": rpc error: code = NotFound desc = could not find container \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": container with ID starting with 61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.593422 4728 scope.go:117] "RemoveContainer" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.597564 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc"} err="failed to get 
container status \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": rpc error: code = NotFound desc = could not find container \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": container with ID starting with b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.597595 4728 scope.go:117] "RemoveContainer" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.597808 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54"} err="failed to get container status \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": rpc error: code = NotFound desc = could not find container \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": container with ID starting with d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.597837 4728 scope.go:117] "RemoveContainer" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.598879 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae"} err="failed to get container status \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": rpc error: code = NotFound desc = could not find container \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": container with ID starting with 33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.598907 4728 scope.go:117] "RemoveContainer" 
containerID="61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.599873 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03"} err="failed to get container status \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": rpc error: code = NotFound desc = could not find container \"61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03\": container with ID starting with 61d66efb02da96cf843cf897b80f6137c54d409afd5eff6bef9535e0d614be03 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.599902 4728 scope.go:117] "RemoveContainer" containerID="b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.600853 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc"} err="failed to get container status \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": rpc error: code = NotFound desc = could not find container \"b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc\": container with ID starting with b8b1d312f8098abfa4d17463adb0959a943ef1ce141d41533f8c18521fd6f6bc not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.600880 4728 scope.go:117] "RemoveContainer" containerID="d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.601578 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54"} err="failed to get container status \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": rpc error: code = NotFound desc = could 
not find container \"d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54\": container with ID starting with d8d5cfe1407014ff59362518af5679a053cefc7f7325a21a3d0576e70227fb54 not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.601609 4728 scope.go:117] "RemoveContainer" containerID="33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.601992 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae"} err="failed to get container status \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": rpc error: code = NotFound desc = could not find container \"33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae\": container with ID starting with 33cac043a295b1dffa141bd58e61c3b0d2c50344c92a0fcf3c7ad6e1056c7bae not found: ID does not exist" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.609775 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-run-httpd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.609808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-config-data\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.609896 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.609933 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6drd\" (UniqueName: \"kubernetes.io/projected/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-kube-api-access-z6drd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.609970 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-scripts\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.610017 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.610054 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-log-httpd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.714846 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: 
I0125 05:55:13.714919 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6drd\" (UniqueName: \"kubernetes.io/projected/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-kube-api-access-z6drd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.715614 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-scripts\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.715699 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.716123 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-log-httpd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.716491 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-log-httpd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.716642 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-run-httpd\") pod \"ceilometer-0\" (UID: 
\"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.716695 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-config-data\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.716964 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-run-httpd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.718352 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.719224 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-scripts\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.730345 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-config-data\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.730711 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.730769 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6drd\" (UniqueName: \"kubernetes.io/projected/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-kube-api-access-z6drd\") pod \"ceilometer-0\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " pod="openstack/ceilometer-0" Jan 25 05:55:13 crc kubenswrapper[4728]: I0125 05:55:13.902107 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.340555 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.485739 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerStarted","Data":"e85e8e4e14d734efc00c5c2f7d7ba2a6b3e8e4a2852536a21dc9f703fb39b24f"} Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.843913 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.952467 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r478\" (UniqueName: \"kubernetes.io/projected/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-kube-api-access-7r478\") pod \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.952607 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-operator-scripts\") pod \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\" (UID: \"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb\") " Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.953117 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" (UID: "33c35f4a-a6c8-4be9-9ff5-558125a8e2bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.956307 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.957982 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-kube-api-access-7r478" (OuterVolumeSpecName: "kube-api-access-7r478") pod "33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" (UID: "33c35f4a-a6c8-4be9-9ff5-558125a8e2bb"). InnerVolumeSpecName "kube-api-access-7r478". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.960938 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.964843 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.973508 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:14 crc kubenswrapper[4728]: I0125 05:55:14.977181 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054216 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17bc2814-7f95-490f-8ac3-2596dad8acc7-operator-scripts\") pod \"17bc2814-7f95-490f-8ac3-2596dad8acc7\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054275 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5739844d-c36a-4b25-b5e1-6049a9be36a5-operator-scripts\") pod \"5739844d-c36a-4b25-b5e1-6049a9be36a5\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054341 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257xq\" (UniqueName: \"kubernetes.io/projected/5739844d-c36a-4b25-b5e1-6049a9be36a5-kube-api-access-257xq\") pod \"5739844d-c36a-4b25-b5e1-6049a9be36a5\" (UID: \"5739844d-c36a-4b25-b5e1-6049a9be36a5\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054396 4728 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7zjm\" (UniqueName: \"kubernetes.io/projected/5a338507-ec0d-4723-9eff-8242791ac1e4-kube-api-access-t7zjm\") pod \"5a338507-ec0d-4723-9eff-8242791ac1e4\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054418 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0154728c-361d-44a4-85ca-39167e69fc69-operator-scripts\") pod \"0154728c-361d-44a4-85ca-39167e69fc69\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054483 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j74k\" (UniqueName: \"kubernetes.io/projected/65872da3-e443-40db-8a63-96aea1382b3f-kube-api-access-6j74k\") pod \"65872da3-e443-40db-8a63-96aea1382b3f\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054513 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsxjl\" (UniqueName: \"kubernetes.io/projected/0154728c-361d-44a4-85ca-39167e69fc69-kube-api-access-jsxjl\") pod \"0154728c-361d-44a4-85ca-39167e69fc69\" (UID: \"0154728c-361d-44a4-85ca-39167e69fc69\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krcmm\" (UniqueName: \"kubernetes.io/projected/17bc2814-7f95-490f-8ac3-2596dad8acc7-kube-api-access-krcmm\") pod \"17bc2814-7f95-490f-8ac3-2596dad8acc7\" (UID: \"17bc2814-7f95-490f-8ac3-2596dad8acc7\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054636 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/65872da3-e443-40db-8a63-96aea1382b3f-operator-scripts\") pod \"65872da3-e443-40db-8a63-96aea1382b3f\" (UID: \"65872da3-e443-40db-8a63-96aea1382b3f\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054662 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a338507-ec0d-4723-9eff-8242791ac1e4-operator-scripts\") pod \"5a338507-ec0d-4723-9eff-8242791ac1e4\" (UID: \"5a338507-ec0d-4723-9eff-8242791ac1e4\") " Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054674 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17bc2814-7f95-490f-8ac3-2596dad8acc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17bc2814-7f95-490f-8ac3-2596dad8acc7" (UID: "17bc2814-7f95-490f-8ac3-2596dad8acc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.054815 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5739844d-c36a-4b25-b5e1-6049a9be36a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5739844d-c36a-4b25-b5e1-6049a9be36a5" (UID: "5739844d-c36a-4b25-b5e1-6049a9be36a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055058 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r478\" (UniqueName: \"kubernetes.io/projected/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-kube-api-access-7r478\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055075 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17bc2814-7f95-490f-8ac3-2596dad8acc7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055085 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5739844d-c36a-4b25-b5e1-6049a9be36a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055096 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055402 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0154728c-361d-44a4-85ca-39167e69fc69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0154728c-361d-44a4-85ca-39167e69fc69" (UID: "0154728c-361d-44a4-85ca-39167e69fc69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055492 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a338507-ec0d-4723-9eff-8242791ac1e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a338507-ec0d-4723-9eff-8242791ac1e4" (UID: "5a338507-ec0d-4723-9eff-8242791ac1e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.055751 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65872da3-e443-40db-8a63-96aea1382b3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65872da3-e443-40db-8a63-96aea1382b3f" (UID: "65872da3-e443-40db-8a63-96aea1382b3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.058374 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a338507-ec0d-4723-9eff-8242791ac1e4-kube-api-access-t7zjm" (OuterVolumeSpecName: "kube-api-access-t7zjm") pod "5a338507-ec0d-4723-9eff-8242791ac1e4" (UID: "5a338507-ec0d-4723-9eff-8242791ac1e4"). InnerVolumeSpecName "kube-api-access-t7zjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.058873 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65872da3-e443-40db-8a63-96aea1382b3f-kube-api-access-6j74k" (OuterVolumeSpecName: "kube-api-access-6j74k") pod "65872da3-e443-40db-8a63-96aea1382b3f" (UID: "65872da3-e443-40db-8a63-96aea1382b3f"). InnerVolumeSpecName "kube-api-access-6j74k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.059031 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0154728c-361d-44a4-85ca-39167e69fc69-kube-api-access-jsxjl" (OuterVolumeSpecName: "kube-api-access-jsxjl") pod "0154728c-361d-44a4-85ca-39167e69fc69" (UID: "0154728c-361d-44a4-85ca-39167e69fc69"). InnerVolumeSpecName "kube-api-access-jsxjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.060678 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5739844d-c36a-4b25-b5e1-6049a9be36a5-kube-api-access-257xq" (OuterVolumeSpecName: "kube-api-access-257xq") pod "5739844d-c36a-4b25-b5e1-6049a9be36a5" (UID: "5739844d-c36a-4b25-b5e1-6049a9be36a5"). InnerVolumeSpecName "kube-api-access-257xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.061408 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17bc2814-7f95-490f-8ac3-2596dad8acc7-kube-api-access-krcmm" (OuterVolumeSpecName: "kube-api-access-krcmm") pod "17bc2814-7f95-490f-8ac3-2596dad8acc7" (UID: "17bc2814-7f95-490f-8ac3-2596dad8acc7"). InnerVolumeSpecName "kube-api-access-krcmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160776 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257xq\" (UniqueName: \"kubernetes.io/projected/5739844d-c36a-4b25-b5e1-6049a9be36a5-kube-api-access-257xq\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160810 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7zjm\" (UniqueName: \"kubernetes.io/projected/5a338507-ec0d-4723-9eff-8242791ac1e4-kube-api-access-t7zjm\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160821 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0154728c-361d-44a4-85ca-39167e69fc69-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160837 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j74k\" (UniqueName: 
\"kubernetes.io/projected/65872da3-e443-40db-8a63-96aea1382b3f-kube-api-access-6j74k\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160847 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsxjl\" (UniqueName: \"kubernetes.io/projected/0154728c-361d-44a4-85ca-39167e69fc69-kube-api-access-jsxjl\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160856 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krcmm\" (UniqueName: \"kubernetes.io/projected/17bc2814-7f95-490f-8ac3-2596dad8acc7-kube-api-access-krcmm\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160865 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65872da3-e443-40db-8a63-96aea1382b3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.160873 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a338507-ec0d-4723-9eff-8242791ac1e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.339006 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db7e175-0ee7-4017-ac12-dc357676e8e1" path="/var/lib/kubelet/pods/6db7e175-0ee7-4017-ac12-dc357676e8e1/volumes" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.526417 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.527553 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e2fd-account-create-update-5n5s4" event={"ID":"65872da3-e443-40db-8a63-96aea1382b3f","Type":"ContainerDied","Data":"f218185ec11088972c8f5c40016b24e7047bf4301266676a5bb9ccaa219b6dd9"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.527586 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f218185ec11088972c8f5c40016b24e7047bf4301266676a5bb9ccaa219b6dd9" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.530173 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9gc2d" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.530858 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9gc2d" event={"ID":"17bc2814-7f95-490f-8ac3-2596dad8acc7","Type":"ContainerDied","Data":"35910ae273212f0f260a5b38781615085b42af3a5d0f0d72f5632f7a8ee9a6f2"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.530896 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35910ae273212f0f260a5b38781615085b42af3a5d0f0d72f5632f7a8ee9a6f2" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.533299 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.533308 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4eec-account-create-update-m2z8d" event={"ID":"0154728c-361d-44a4-85ca-39167e69fc69","Type":"ContainerDied","Data":"fbacaa9c4eac72f056eceae3fd8bf92fa99abd45e8242e3fd1e7e6d2d6bea77d"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.533354 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbacaa9c4eac72f056eceae3fd8bf92fa99abd45e8242e3fd1e7e6d2d6bea77d" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.535307 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hz4pk" event={"ID":"5739844d-c36a-4b25-b5e1-6049a9be36a5","Type":"ContainerDied","Data":"d1b2aa0e663ce3cdd4a9d3c3bb1fbeb896c99a1d626d72ed2efa9f42d7ab49cf"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.535347 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1b2aa0e663ce3cdd4a9d3c3bb1fbeb896c99a1d626d72ed2efa9f42d7ab49cf" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.535388 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hz4pk" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.538064 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerStarted","Data":"c65f7dd151a482e985427acd76e3c32d71156c049057d8b81efaa4d05a6fc45a"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.539564 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cxrhx" event={"ID":"33c35f4a-a6c8-4be9-9ff5-558125a8e2bb","Type":"ContainerDied","Data":"f8165ca20d466dae2ea2356d43e072dad7dbabf90d7dcbbd36f81dea11d2b0d6"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.539606 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8165ca20d466dae2ea2356d43e072dad7dbabf90d7dcbbd36f81dea11d2b0d6" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.539667 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cxrhx" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.541207 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bbea-account-create-update-t7pck" event={"ID":"5a338507-ec0d-4723-9eff-8242791ac1e4","Type":"ContainerDied","Data":"75973f1f5739768ef239ebdd43b7cd4e5031f3fc7b27de315137e95e9996b124"} Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.541232 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75973f1f5739768ef239ebdd43b7cd4e5031f3fc7b27de315137e95e9996b124" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.541271 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bbea-account-create-update-t7pck" Jan 25 05:55:15 crc kubenswrapper[4728]: I0125 05:55:15.734977 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.553715 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerStarted","Data":"4cbcc41fb66daea6c5a7df7691d3bc173f6a67cbe0ef2baa8ef4ef2877584373"} Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.927172 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.927487 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-log" containerID="cri-o://a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae" gracePeriod=30 Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.927679 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-httpd" containerID="cri-o://64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce" gracePeriod=30 Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.971668 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zztbh"] Jan 25 05:55:16 crc kubenswrapper[4728]: E0125 05:55:16.971987 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bc2814-7f95-490f-8ac3-2596dad8acc7" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972006 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bc2814-7f95-490f-8ac3-2596dad8acc7" containerName="mariadb-database-create" Jan 
25 05:55:16 crc kubenswrapper[4728]: E0125 05:55:16.972022 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65872da3-e443-40db-8a63-96aea1382b3f" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972028 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="65872da3-e443-40db-8a63-96aea1382b3f" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: E0125 05:55:16.972040 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0154728c-361d-44a4-85ca-39167e69fc69" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972046 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0154728c-361d-44a4-85ca-39167e69fc69" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: E0125 05:55:16.972058 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972063 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: E0125 05:55:16.972072 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739844d-c36a-4b25-b5e1-6049a9be36a5" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972078 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5739844d-c36a-4b25-b5e1-6049a9be36a5" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: E0125 05:55:16.972088 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a338507-ec0d-4723-9eff-8242791ac1e4" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972094 4728 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5a338507-ec0d-4723-9eff-8242791ac1e4" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972254 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="17bc2814-7f95-490f-8ac3-2596dad8acc7" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972267 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a338507-ec0d-4723-9eff-8242791ac1e4" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972276 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="65872da3-e443-40db-8a63-96aea1382b3f" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972287 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972302 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0154728c-361d-44a4-85ca-39167e69fc69" containerName="mariadb-account-create-update" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972309 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5739844d-c36a-4b25-b5e1-6049a9be36a5" containerName="mariadb-database-create" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.972831 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.986684 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zztbh"] Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.987104 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.987276 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 25 05:55:16 crc kubenswrapper[4728]: I0125 05:55:16.987463 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-w2crb" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.017287 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.017471 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-config-data\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.017659 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-scripts\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " 
pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.017763 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdn5z\" (UniqueName: \"kubernetes.io/projected/f5da755b-2c82-4436-ab58-bc22b7888ae4-kube-api-access-kdn5z\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.120114 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.120207 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-config-data\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.120274 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-scripts\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.120359 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdn5z\" (UniqueName: \"kubernetes.io/projected/f5da755b-2c82-4436-ab58-bc22b7888ae4-kube-api-access-kdn5z\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: 
\"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.124650 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.125003 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-scripts\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.130736 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-config-data\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.133206 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdn5z\" (UniqueName: \"kubernetes.io/projected/f5da755b-2c82-4436-ab58-bc22b7888ae4-kube-api-access-kdn5z\") pod \"nova-cell0-conductor-db-sync-zztbh\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.301463 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.563821 4728 generic.go:334] "Generic (PLEG): container finished" podID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerID="a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae" exitCode=143 Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.563939 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3989d07c-292d-40ec-ac11-ffce42ffde68","Type":"ContainerDied","Data":"a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae"} Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.566650 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerStarted","Data":"eb6a2eb429a08b589303ef5959e8e745484462279242e8f58a17ca7e580c7b87"} Jan 25 05:55:17 crc kubenswrapper[4728]: I0125 05:55:17.692717 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zztbh"] Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.242093 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.242592 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-log" containerID="cri-o://87e3a51b46f91c1af523d1cef14e5d09b8e8342f0ca05fb5220400f5eb512081" gracePeriod=30 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.242720 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-httpd" containerID="cri-o://568fecdccb03cf4c054e2764aff883c0a4450be038f180f93ddca486d69c8768" 
gracePeriod=30 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.581019 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerStarted","Data":"6be35dfe571200c770abb50668d3a413846460235bc68026358515eef8c2d59d"} Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.581207 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-central-agent" containerID="cri-o://c65f7dd151a482e985427acd76e3c32d71156c049057d8b81efaa4d05a6fc45a" gracePeriod=30 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.581811 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.582082 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="proxy-httpd" containerID="cri-o://6be35dfe571200c770abb50668d3a413846460235bc68026358515eef8c2d59d" gracePeriod=30 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.582132 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="sg-core" containerID="cri-o://eb6a2eb429a08b589303ef5959e8e745484462279242e8f58a17ca7e580c7b87" gracePeriod=30 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.582171 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-notification-agent" containerID="cri-o://4cbcc41fb66daea6c5a7df7691d3bc173f6a67cbe0ef2baa8ef4ef2877584373" gracePeriod=30 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.593647 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="de462b89-925b-42f0-9590-a93b2081cc41" containerID="87e3a51b46f91c1af523d1cef14e5d09b8e8342f0ca05fb5220400f5eb512081" exitCode=143 Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.593734 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de462b89-925b-42f0-9590-a93b2081cc41","Type":"ContainerDied","Data":"87e3a51b46f91c1af523d1cef14e5d09b8e8342f0ca05fb5220400f5eb512081"} Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.599137 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zztbh" event={"ID":"f5da755b-2c82-4436-ab58-bc22b7888ae4","Type":"ContainerStarted","Data":"b4eab9d9d2b6cc9563557727b1bef25d156f69af828db1f470542e1d2f95aa1d"} Jan 25 05:55:18 crc kubenswrapper[4728]: I0125 05:55:18.603994 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.364451615 podStartE2EDuration="5.603981824s" podCreationTimestamp="2026-01-25 05:55:13 +0000 UTC" firstStartedPulling="2026-01-25 05:55:14.342484581 +0000 UTC m=+1005.378362561" lastFinishedPulling="2026-01-25 05:55:17.58201479 +0000 UTC m=+1008.617892770" observedRunningTime="2026-01-25 05:55:18.602581202 +0000 UTC m=+1009.638459182" watchObservedRunningTime="2026-01-25 05:55:18.603981824 +0000 UTC m=+1009.639859804" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.610574 4728 generic.go:334] "Generic (PLEG): container finished" podID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerID="6be35dfe571200c770abb50668d3a413846460235bc68026358515eef8c2d59d" exitCode=0 Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.610950 4728 generic.go:334] "Generic (PLEG): container finished" podID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerID="eb6a2eb429a08b589303ef5959e8e745484462279242e8f58a17ca7e580c7b87" exitCode=2 Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.610962 4728 generic.go:334] "Generic (PLEG): 
container finished" podID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerID="4cbcc41fb66daea6c5a7df7691d3bc173f6a67cbe0ef2baa8ef4ef2877584373" exitCode=0 Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.610971 4728 generic.go:334] "Generic (PLEG): container finished" podID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerID="c65f7dd151a482e985427acd76e3c32d71156c049057d8b81efaa4d05a6fc45a" exitCode=0 Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.610659 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerDied","Data":"6be35dfe571200c770abb50668d3a413846460235bc68026358515eef8c2d59d"} Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.611019 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerDied","Data":"eb6a2eb429a08b589303ef5959e8e745484462279242e8f58a17ca7e580c7b87"} Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.611036 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerDied","Data":"4cbcc41fb66daea6c5a7df7691d3bc173f6a67cbe0ef2baa8ef4ef2877584373"} Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.611047 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerDied","Data":"c65f7dd151a482e985427acd76e3c32d71156c049057d8b81efaa4d05a6fc45a"} Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.812169 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892141 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-sg-core-conf-yaml\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892196 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-run-httpd\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892219 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-config-data\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892379 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-log-httpd\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892429 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6drd\" (UniqueName: \"kubernetes.io/projected/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-kube-api-access-z6drd\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892484 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-scripts\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892565 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-combined-ca-bundle\") pod \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\" (UID: \"dceb6a6b-51c5-4f85-b6a9-2c7001dea867\") " Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.892823 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.893336 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.893569 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.893595 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.909019 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-scripts" (OuterVolumeSpecName: "scripts") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.917236 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-kube-api-access-z6drd" (OuterVolumeSpecName: "kube-api-access-z6drd") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "kube-api-access-z6drd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.918720 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.948655 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.988794 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-config-data" (OuterVolumeSpecName: "config-data") pod "dceb6a6b-51c5-4f85-b6a9-2c7001dea867" (UID: "dceb6a6b-51c5-4f85-b6a9-2c7001dea867"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.995575 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.995610 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.995623 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:19 crc kubenswrapper[4728]: I0125 05:55:19.995633 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:19 
crc kubenswrapper[4728]: I0125 05:55:19.995641 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6drd\" (UniqueName: \"kubernetes.io/projected/dceb6a6b-51c5-4f85-b6a9-2c7001dea867-kube-api-access-z6drd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.471902 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.525903 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-httpd-run\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.525987 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.526072 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-combined-ca-bundle\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.526140 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-scripts\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.526189 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-logs\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.526252 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx7hd\" (UniqueName: \"kubernetes.io/projected/3989d07c-292d-40ec-ac11-ffce42ffde68-kube-api-access-gx7hd\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.526279 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-config-data\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.526363 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-public-tls-certs\") pod \"3989d07c-292d-40ec-ac11-ffce42ffde68\" (UID: \"3989d07c-292d-40ec-ac11-ffce42ffde68\") " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.529010 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.529427 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-logs" (OuterVolumeSpecName: "logs") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.532169 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.534387 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3989d07c-292d-40ec-ac11-ffce42ffde68-kube-api-access-gx7hd" (OuterVolumeSpecName: "kube-api-access-gx7hd") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "kube-api-access-gx7hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.560717 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-scripts" (OuterVolumeSpecName: "scripts") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.564928 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.577149 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.593289 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-config-data" (OuterVolumeSpecName: "config-data") pod "3989d07c-292d-40ec-ac11-ffce42ffde68" (UID: "3989d07c-292d-40ec-ac11-ffce42ffde68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634036 4728 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634066 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634098 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634109 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc 
kubenswrapper[4728]: I0125 05:55:20.634118 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634127 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3989d07c-292d-40ec-ac11-ffce42ffde68-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634137 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx7hd\" (UniqueName: \"kubernetes.io/projected/3989d07c-292d-40ec-ac11-ffce42ffde68-kube-api-access-gx7hd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.634145 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3989d07c-292d-40ec-ac11-ffce42ffde68-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.635568 4728 generic.go:334] "Generic (PLEG): container finished" podID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerID="64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce" exitCode=0 Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.635724 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.639448 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3989d07c-292d-40ec-ac11-ffce42ffde68","Type":"ContainerDied","Data":"64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce"} Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.639485 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3989d07c-292d-40ec-ac11-ffce42ffde68","Type":"ContainerDied","Data":"19bc42390f2c1504c99588b1d8e0829bad251a73e59fee27cac498423f9d52a7"} Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.639501 4728 scope.go:117] "RemoveContainer" containerID="64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.646551 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dceb6a6b-51c5-4f85-b6a9-2c7001dea867","Type":"ContainerDied","Data":"e85e8e4e14d734efc00c5c2f7d7ba2a6b3e8e4a2852536a21dc9f703fb39b24f"} Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.646622 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.657063 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.688397 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.695689 4728 scope.go:117] "RemoveContainer" containerID="a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.700293 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.706544 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.718810 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.736439 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.756533 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.757593 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-notification-agent" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.757628 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-notification-agent" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.757656 4728 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-central-agent" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.757663 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-central-agent" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.757683 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="proxy-httpd" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.757693 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="proxy-httpd" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.757716 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-log" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.757725 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-log" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.757745 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-httpd" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.757752 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-httpd" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.757771 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="sg-core" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.757778 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="sg-core" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758138 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-httpd" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758170 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-central-agent" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758182 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="ceilometer-notification-agent" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758198 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="proxy-httpd" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758215 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" containerName="glance-log" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758228 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" containerName="sg-core" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.758887 4728 scope.go:117] "RemoveContainer" containerID="64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.759648 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce\": container with ID starting with 64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce not found: ID does not exist" containerID="64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.759711 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce"} err="failed to get container 
status \"64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce\": rpc error: code = NotFound desc = could not find container \"64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce\": container with ID starting with 64255d66f4b86385c2b0cbd2d7ae591493f4c8935f31f934ee8348d7f5f26dce not found: ID does not exist" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.760085 4728 scope.go:117] "RemoveContainer" containerID="a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae" Jan 25 05:55:20 crc kubenswrapper[4728]: E0125 05:55:20.760650 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae\": container with ID starting with a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae not found: ID does not exist" containerID="a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.760699 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae"} err="failed to get container status \"a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae\": rpc error: code = NotFound desc = could not find container \"a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae\": container with ID starting with a36d3916d5761ed3612097e62f069850c20dbb3d503fc5b8a91a2f7211db87ae not found: ID does not exist" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.760723 4728 scope.go:117] "RemoveContainer" containerID="6be35dfe571200c770abb50668d3a413846460235bc68026358515eef8c2d59d" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.762274 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.765301 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.765479 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.782096 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.785330 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.786974 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.788193 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.791094 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.808289 4728 scope.go:117] "RemoveContainer" containerID="eb6a2eb429a08b589303ef5959e8e745484462279242e8f58a17ca7e580c7b87" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.814085 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.838408 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-config-data\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 
05:55:20.838517 4728 scope.go:117] "RemoveContainer" containerID="4cbcc41fb66daea6c5a7df7691d3bc173f6a67cbe0ef2baa8ef4ef2877584373" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.838568 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-run-httpd\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.838794 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwlx\" (UniqueName: \"kubernetes.io/projected/641ae513-9950-47c5-8c12-2061567f6e53-kube-api-access-knwlx\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.838965 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-scripts\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.839003 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-log-httpd\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.839023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 
25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.839051 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.856098 4728 scope.go:117] "RemoveContainer" containerID="c65f7dd151a482e985427acd76e3c32d71156c049057d8b81efaa4d05a6fc45a" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941147 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-config-data\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941196 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-config-data\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941223 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5ww\" (UniqueName: \"kubernetes.io/projected/8fbf2f2e-5205-4c3d-8b05-185404930c85-kube-api-access-wc5ww\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941247 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf2f2e-5205-4c3d-8b05-185404930c85-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941285 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-run-httpd\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941310 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf2f2e-5205-4c3d-8b05-185404930c85-logs\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941338 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-scripts\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941376 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941410 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knwlx\" (UniqueName: \"kubernetes.io/projected/641ae513-9950-47c5-8c12-2061567f6e53-kube-api-access-knwlx\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " 
pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941445 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941474 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941498 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-scripts\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941519 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-log-httpd\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941534 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.941551 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.945940 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-log-httpd\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.946002 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-run-httpd\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.946621 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.950729 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-config-data\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.951553 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-scripts\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc 
kubenswrapper[4728]: I0125 05:55:20.951706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:20 crc kubenswrapper[4728]: I0125 05:55:20.959521 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwlx\" (UniqueName: \"kubernetes.io/projected/641ae513-9950-47c5-8c12-2061567f6e53-kube-api-access-knwlx\") pod \"ceilometer-0\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " pod="openstack/ceilometer-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043661 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-config-data\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043719 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5ww\" (UniqueName: \"kubernetes.io/projected/8fbf2f2e-5205-4c3d-8b05-185404930c85-kube-api-access-wc5ww\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043743 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf2f2e-5205-4c3d-8b05-185404930c85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043790 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf2f2e-5205-4c3d-8b05-185404930c85-logs\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043812 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-scripts\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043857 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043897 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.043932 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.044585 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8fbf2f2e-5205-4c3d-8b05-185404930c85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.044596 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.044810 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf2f2e-5205-4c3d-8b05-185404930c85-logs\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.047131 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-scripts\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.048068 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.048149 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.049196 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf2f2e-5205-4c3d-8b05-185404930c85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.056353 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5ww\" (UniqueName: \"kubernetes.io/projected/8fbf2f2e-5205-4c3d-8b05-185404930c85-kube-api-access-wc5ww\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.084419 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.085928 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8fbf2f2e-5205-4c3d-8b05-185404930c85\") " pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.117468 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.339535 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3989d07c-292d-40ec-ac11-ffce42ffde68" path="/var/lib/kubelet/pods/3989d07c-292d-40ec-ac11-ffce42ffde68/volumes" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.340236 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dceb6a6b-51c5-4f85-b6a9-2c7001dea867" path="/var/lib/kubelet/pods/dceb6a6b-51c5-4f85-b6a9-2c7001dea867/volumes" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.583107 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:21 crc kubenswrapper[4728]: W0125 05:55:21.587398 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod641ae513_9950_47c5_8c12_2061567f6e53.slice/crio-5cc905ef97fdbd0fa2b9699a33ad868240357dd67e0f4af8ec4c484006421d52 WatchSource:0}: Error finding container 5cc905ef97fdbd0fa2b9699a33ad868240357dd67e0f4af8ec4c484006421d52: Status 404 returned error can't find the container with id 5cc905ef97fdbd0fa2b9699a33ad868240357dd67e0f4af8ec4c484006421d52 Jan 25 05:55:21 crc kubenswrapper[4728]: E0125 05:55:21.595387 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde462b89_925b_42f0_9590_a93b2081cc41.slice/crio-568fecdccb03cf4c054e2764aff883c0a4450be038f180f93ddca486d69c8768.scope\": RecentStats: unable to find data in memory cache]" Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.730566 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerStarted","Data":"5cc905ef97fdbd0fa2b9699a33ad868240357dd67e0f4af8ec4c484006421d52"} Jan 25 05:55:21 crc 
kubenswrapper[4728]: I0125 05:55:21.757826 4728 generic.go:334] "Generic (PLEG): container finished" podID="de462b89-925b-42f0-9590-a93b2081cc41" containerID="568fecdccb03cf4c054e2764aff883c0a4450be038f180f93ddca486d69c8768" exitCode=0 Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.757910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de462b89-925b-42f0-9590-a93b2081cc41","Type":"ContainerDied","Data":"568fecdccb03cf4c054e2764aff883c0a4450be038f180f93ddca486d69c8768"} Jan 25 05:55:21 crc kubenswrapper[4728]: I0125 05:55:21.865736 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.028811 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173156 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-httpd-run\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173287 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6xqd\" (UniqueName: \"kubernetes.io/projected/de462b89-925b-42f0-9590-a93b2081cc41-kube-api-access-r6xqd\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173378 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173449 
4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-combined-ca-bundle\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173515 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-logs\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173572 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-config-data\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173588 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-internal-tls-certs\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.173695 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-scripts\") pod \"de462b89-925b-42f0-9590-a93b2081cc41\" (UID: \"de462b89-925b-42f0-9590-a93b2081cc41\") " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.178601 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: 
"de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.178689 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-logs" (OuterVolumeSpecName: "logs") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.181504 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de462b89-925b-42f0-9590-a93b2081cc41-kube-api-access-r6xqd" (OuterVolumeSpecName: "kube-api-access-r6xqd") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "kube-api-access-r6xqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.183576 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-scripts" (OuterVolumeSpecName: "scripts") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.187287 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.209473 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.226855 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-config-data" (OuterVolumeSpecName: "config-data") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.229473 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "de462b89-925b-42f0-9590-a93b2081cc41" (UID: "de462b89-925b-42f0-9590-a93b2081cc41"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276250 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276284 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276295 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276306 4728 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276337 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de462b89-925b-42f0-9590-a93b2081cc41-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276347 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de462b89-925b-42f0-9590-a93b2081cc41-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276359 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6xqd\" (UniqueName: \"kubernetes.io/projected/de462b89-925b-42f0-9590-a93b2081cc41-kube-api-access-r6xqd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.276396 4728 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.294452 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.349483 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.377883 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.802026 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de462b89-925b-42f0-9590-a93b2081cc41","Type":"ContainerDied","Data":"73bd08adb260d4d7e2fdf8ca7e14d40eb8528b4f961c790504608541f17632a1"} Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.802082 4728 scope.go:117] "RemoveContainer" containerID="568fecdccb03cf4c054e2764aff883c0a4450be038f180f93ddca486d69c8768" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.802450 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.805716 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8fbf2f2e-5205-4c3d-8b05-185404930c85","Type":"ContainerStarted","Data":"6a34828e9899ccb0cefa671a9d7f2a00999abc916a42a0979ea2f9a81de4df05"} Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.805765 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8fbf2f2e-5205-4c3d-8b05-185404930c85","Type":"ContainerStarted","Data":"b8725b82c9ffeaf97865fa897670415f817f70d869d2b30e1ff73d6e64ee316c"} Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.809044 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerStarted","Data":"845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef"} Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.838727 4728 scope.go:117] "RemoveContainer" containerID="87e3a51b46f91c1af523d1cef14e5d09b8e8342f0ca05fb5220400f5eb512081" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.840773 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.852641 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.869681 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:55:22 crc kubenswrapper[4728]: E0125 05:55:22.870133 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-log" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.870150 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-log" Jan 25 05:55:22 crc kubenswrapper[4728]: E0125 05:55:22.870179 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-httpd" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.870185 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-httpd" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.870424 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-httpd" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.870444 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="de462b89-925b-42f0-9590-a93b2081cc41" containerName="glance-log" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.871466 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.874418 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.877807 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.882372 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988225 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/036ddc84-2b06-4817-9afd-537d8ed82150-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc 
kubenswrapper[4728]: I0125 05:55:22.988333 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988368 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-config-data\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988402 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988435 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988461 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036ddc84-2b06-4817-9afd-537d8ed82150-logs\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 
05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988523 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cldrz\" (UniqueName: \"kubernetes.io/projected/036ddc84-2b06-4817-9afd-537d8ed82150-kube-api-access-cldrz\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:22 crc kubenswrapper[4728]: I0125 05:55:22.988600 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-scripts\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090116 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-scripts\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090194 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/036ddc84-2b06-4817-9afd-537d8ed82150-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090232 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc 
kubenswrapper[4728]: I0125 05:55:23.090254 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-config-data\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090282 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090305 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090337 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036ddc84-2b06-4817-9afd-537d8ed82150-logs\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.090371 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cldrz\" (UniqueName: \"kubernetes.io/projected/036ddc84-2b06-4817-9afd-537d8ed82150-kube-api-access-cldrz\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.091641 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/036ddc84-2b06-4817-9afd-537d8ed82150-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.092034 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.092824 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036ddc84-2b06-4817-9afd-537d8ed82150-logs\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.096388 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-scripts\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.096447 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.096763 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.101378 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036ddc84-2b06-4817-9afd-537d8ed82150-config-data\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.105586 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cldrz\" (UniqueName: \"kubernetes.io/projected/036ddc84-2b06-4817-9afd-537d8ed82150-kube-api-access-cldrz\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.134972 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"036ddc84-2b06-4817-9afd-537d8ed82150\") " pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.197822 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.339234 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de462b89-925b-42f0-9590-a93b2081cc41" path="/var/lib/kubelet/pods/de462b89-925b-42f0-9590-a93b2081cc41/volumes" Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.711020 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.834926 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8fbf2f2e-5205-4c3d-8b05-185404930c85","Type":"ContainerStarted","Data":"29028aca2bb9bb6af55181e9c4aefa03d541df55e89f9dd99480a4c0c330ee62"} Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.838455 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"036ddc84-2b06-4817-9afd-537d8ed82150","Type":"ContainerStarted","Data":"27a7176c1bd3b9f41fd081a96bb7c6c51bfa44873946b22dd2a49cdae0a93023"} Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.848731 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerStarted","Data":"4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6"} Jan 25 05:55:23 crc kubenswrapper[4728]: I0125 05:55:23.860642 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.86062979 podStartE2EDuration="3.86062979s" podCreationTimestamp="2026-01-25 05:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:23.851576357 +0000 UTC m=+1014.887454337" watchObservedRunningTime="2026-01-25 05:55:23.86062979 +0000 UTC m=+1014.896507770" Jan 25 
05:55:24 crc kubenswrapper[4728]: I0125 05:55:24.873380 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"036ddc84-2b06-4817-9afd-537d8ed82150","Type":"ContainerStarted","Data":"eb19846cdf476193fe5254211abb8e8d4216d1806e97d60a3032c42a9807978f"} Jan 25 05:55:29 crc kubenswrapper[4728]: I0125 05:55:29.914180 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zztbh" event={"ID":"f5da755b-2c82-4436-ab58-bc22b7888ae4","Type":"ContainerStarted","Data":"8827d8f2a96d7732dea844b3c9fae235e05bcd2bc26068b2908e3ffc81948cc1"} Jan 25 05:55:29 crc kubenswrapper[4728]: I0125 05:55:29.915801 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerStarted","Data":"271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b"} Jan 25 05:55:29 crc kubenswrapper[4728]: I0125 05:55:29.928196 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zztbh" podStartSLOduration=1.995833279 podStartE2EDuration="13.928182068s" podCreationTimestamp="2026-01-25 05:55:16 +0000 UTC" firstStartedPulling="2026-01-25 05:55:17.709035225 +0000 UTC m=+1008.744913205" lastFinishedPulling="2026-01-25 05:55:29.641384015 +0000 UTC m=+1020.677261994" observedRunningTime="2026-01-25 05:55:29.927782785 +0000 UTC m=+1020.963660764" watchObservedRunningTime="2026-01-25 05:55:29.928182068 +0000 UTC m=+1020.964060048" Jan 25 05:55:30 crc kubenswrapper[4728]: I0125 05:55:30.937522 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"036ddc84-2b06-4817-9afd-537d8ed82150","Type":"ContainerStarted","Data":"59ad9f8b203da3e51be5844d400241d7d1067f26f07ad0046065064f25d12fe3"} Jan 25 05:55:30 crc kubenswrapper[4728]: I0125 05:55:30.955598 4728 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.955577647 podStartE2EDuration="8.955577647s" podCreationTimestamp="2026-01-25 05:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:30.954428201 +0000 UTC m=+1021.990306181" watchObservedRunningTime="2026-01-25 05:55:30.955577647 +0000 UTC m=+1021.991455628" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.117795 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.117838 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.148746 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.159810 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.958787 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerStarted","Data":"326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95"} Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.959123 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.959153 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.959227 4728 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-central-agent" containerID="cri-o://845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef" gracePeriod=30 Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.959604 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="proxy-httpd" containerID="cri-o://326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95" gracePeriod=30 Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.959659 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="sg-core" containerID="cri-o://271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b" gracePeriod=30 Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.959698 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-notification-agent" containerID="cri-o://4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6" gracePeriod=30 Jan 25 05:55:31 crc kubenswrapper[4728]: I0125 05:55:31.985290 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.698205372 podStartE2EDuration="11.985276492s" podCreationTimestamp="2026-01-25 05:55:20 +0000 UTC" firstStartedPulling="2026-01-25 05:55:21.600786592 +0000 UTC m=+1012.636664562" lastFinishedPulling="2026-01-25 05:55:30.887857701 +0000 UTC m=+1021.923735682" observedRunningTime="2026-01-25 05:55:31.981055282 +0000 UTC m=+1023.016933261" watchObservedRunningTime="2026-01-25 05:55:31.985276492 +0000 UTC m=+1023.021154472" Jan 25 05:55:32 crc kubenswrapper[4728]: I0125 05:55:32.977853 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="641ae513-9950-47c5-8c12-2061567f6e53" containerID="326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95" exitCode=0 Jan 25 05:55:32 crc kubenswrapper[4728]: I0125 05:55:32.977894 4728 generic.go:334] "Generic (PLEG): container finished" podID="641ae513-9950-47c5-8c12-2061567f6e53" containerID="271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b" exitCode=2 Jan 25 05:55:32 crc kubenswrapper[4728]: I0125 05:55:32.977926 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerDied","Data":"326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95"} Jan 25 05:55:32 crc kubenswrapper[4728]: I0125 05:55:32.977977 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerDied","Data":"271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b"} Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.199004 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.199060 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.244030 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.253992 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.476100 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.617801 4728 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.986688 4728 generic.go:334] "Generic (PLEG): container finished" podID="641ae513-9950-47c5-8c12-2061567f6e53" containerID="845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef" exitCode=0 Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.986767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerDied","Data":"845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef"} Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.987535 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:33 crc kubenswrapper[4728]: I0125 05:55:33.987585 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.463310 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630543 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knwlx\" (UniqueName: \"kubernetes.io/projected/641ae513-9950-47c5-8c12-2061567f6e53-kube-api-access-knwlx\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630624 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-log-httpd\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630812 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-combined-ca-bundle\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630916 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-sg-core-conf-yaml\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630950 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-run-httpd\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630971 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-scripts\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.630989 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-config-data\") pod \"641ae513-9950-47c5-8c12-2061567f6e53\" (UID: \"641ae513-9950-47c5-8c12-2061567f6e53\") " Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.631136 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.631257 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.631669 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.631688 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/641ae513-9950-47c5-8c12-2061567f6e53-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.638271 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/641ae513-9950-47c5-8c12-2061567f6e53-kube-api-access-knwlx" (OuterVolumeSpecName: "kube-api-access-knwlx") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "kube-api-access-knwlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.641415 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-scripts" (OuterVolumeSpecName: "scripts") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.656490 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.682587 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.700377 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-config-data" (OuterVolumeSpecName: "config-data") pod "641ae513-9950-47c5-8c12-2061567f6e53" (UID: "641ae513-9950-47c5-8c12-2061567f6e53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.736422 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knwlx\" (UniqueName: \"kubernetes.io/projected/641ae513-9950-47c5-8c12-2061567f6e53-kube-api-access-knwlx\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.736456 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.736466 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.736481 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-scripts\") on node \"crc\" DevicePath 
\"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.736490 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/641ae513-9950-47c5-8c12-2061567f6e53-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.996426 4728 generic.go:334] "Generic (PLEG): container finished" podID="641ae513-9950-47c5-8c12-2061567f6e53" containerID="4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6" exitCode=0 Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.996766 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerDied","Data":"4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6"} Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.996819 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"641ae513-9950-47c5-8c12-2061567f6e53","Type":"ContainerDied","Data":"5cc905ef97fdbd0fa2b9699a33ad868240357dd67e0f4af8ec4c484006421d52"} Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.996840 4728 scope.go:117] "RemoveContainer" containerID="326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95" Jan 25 05:55:34 crc kubenswrapper[4728]: I0125 05:55:34.997001 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.017559 4728 scope.go:117] "RemoveContainer" containerID="271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.027339 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.032896 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.049302 4728 scope.go:117] "RemoveContainer" containerID="4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058076 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.058533 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="sg-core" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058551 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="sg-core" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.058570 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-notification-agent" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058576 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-notification-agent" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.058595 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="proxy-httpd" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058601 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="proxy-httpd" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.058624 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-central-agent" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058637 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-central-agent" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058830 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="sg-core" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058849 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-notification-agent" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058864 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="ceilometer-central-agent" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.058876 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="641ae513-9950-47c5-8c12-2061567f6e53" containerName="proxy-httpd" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.060683 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.062868 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.063042 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.073922 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.093524 4728 scope.go:117] "RemoveContainer" containerID="845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.109508 4728 scope.go:117] "RemoveContainer" containerID="326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.110524 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95\": container with ID starting with 326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95 not found: ID does not exist" containerID="326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.110598 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95"} err="failed to get container status \"326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95\": rpc error: code = NotFound desc = could not find container \"326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95\": container with ID starting with 326539ac43ce5a6fc032b242d73b109f915e4c50c498d7404c5a085227586b95 not found: ID does not exist" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 
05:55:35.110643 4728 scope.go:117] "RemoveContainer" containerID="271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.111222 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b\": container with ID starting with 271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b not found: ID does not exist" containerID="271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.111264 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b"} err="failed to get container status \"271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b\": rpc error: code = NotFound desc = could not find container \"271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b\": container with ID starting with 271cc1dbd04a4c780aea892bb2a134be0e80a84b5ec11dd46f034347066fb48b not found: ID does not exist" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.111291 4728 scope.go:117] "RemoveContainer" containerID="4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.111622 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6\": container with ID starting with 4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6 not found: ID does not exist" containerID="4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.111659 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6"} err="failed to get container status \"4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6\": rpc error: code = NotFound desc = could not find container \"4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6\": container with ID starting with 4bfc6956eb6a1b01486ff3b71b7b66f1d341b3574b447d769bda655d86acd9b6 not found: ID does not exist" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.111698 4728 scope.go:117] "RemoveContainer" containerID="845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef" Jan 25 05:55:35 crc kubenswrapper[4728]: E0125 05:55:35.111968 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef\": container with ID starting with 845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef not found: ID does not exist" containerID="845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.112003 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef"} err="failed to get container status \"845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef\": rpc error: code = NotFound desc = could not find container \"845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef\": container with ID starting with 845940de916555c118d610c1954c1c5bde62d73441e9432c8222ad099605adef not found: ID does not exist" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.248079 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-run-httpd\") pod \"ceilometer-0\" (UID: 
\"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.248114 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-log-httpd\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.248202 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-scripts\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.248307 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.248355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.248378 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9k7t\" (UniqueName: \"kubernetes.io/projected/7990d8bf-fbf2-479c-9bfb-690fa3141dad-kube-api-access-z9k7t\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 
05:55:35.248409 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-config-data\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.338718 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="641ae513-9950-47c5-8c12-2061567f6e53" path="/var/lib/kubelet/pods/641ae513-9950-47c5-8c12-2061567f6e53/volumes" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349637 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-scripts\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349704 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349731 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349749 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9k7t\" (UniqueName: \"kubernetes.io/projected/7990d8bf-fbf2-479c-9bfb-690fa3141dad-kube-api-access-z9k7t\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " 
pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349771 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-config-data\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349814 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-run-httpd\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.349830 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-log-httpd\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.350173 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-log-httpd\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.351180 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-run-httpd\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.362355 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-scripts\") pod 
\"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.362527 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-config-data\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.362836 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.365903 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.366564 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9k7t\" (UniqueName: \"kubernetes.io/projected/7990d8bf-fbf2-479c-9bfb-690fa3141dad-kube-api-access-z9k7t\") pod \"ceilometer-0\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.391223 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.627407 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:35 crc kubenswrapper[4728]: I0125 05:55:35.800107 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:55:36 crc kubenswrapper[4728]: I0125 05:55:36.010086 4728 generic.go:334] "Generic (PLEG): container finished" podID="f5da755b-2c82-4436-ab58-bc22b7888ae4" containerID="8827d8f2a96d7732dea844b3c9fae235e05bcd2bc26068b2908e3ffc81948cc1" exitCode=0 Jan 25 05:55:36 crc kubenswrapper[4728]: I0125 05:55:36.010187 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zztbh" event={"ID":"f5da755b-2c82-4436-ab58-bc22b7888ae4","Type":"ContainerDied","Data":"8827d8f2a96d7732dea844b3c9fae235e05bcd2bc26068b2908e3ffc81948cc1"} Jan 25 05:55:36 crc kubenswrapper[4728]: I0125 05:55:36.012182 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerStarted","Data":"d2bf1a971dda651d95fd20610bb321b8aa8424a8a7a8ee8962a646661e40d202"} Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.022171 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerStarted","Data":"cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3"} Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.482572 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.616413 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-config-data\") pod \"f5da755b-2c82-4436-ab58-bc22b7888ae4\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.616464 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-scripts\") pod \"f5da755b-2c82-4436-ab58-bc22b7888ae4\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.616508 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-combined-ca-bundle\") pod \"f5da755b-2c82-4436-ab58-bc22b7888ae4\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.616577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdn5z\" (UniqueName: \"kubernetes.io/projected/f5da755b-2c82-4436-ab58-bc22b7888ae4-kube-api-access-kdn5z\") pod \"f5da755b-2c82-4436-ab58-bc22b7888ae4\" (UID: \"f5da755b-2c82-4436-ab58-bc22b7888ae4\") " Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.623144 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5da755b-2c82-4436-ab58-bc22b7888ae4-kube-api-access-kdn5z" (OuterVolumeSpecName: "kube-api-access-kdn5z") pod "f5da755b-2c82-4436-ab58-bc22b7888ae4" (UID: "f5da755b-2c82-4436-ab58-bc22b7888ae4"). InnerVolumeSpecName "kube-api-access-kdn5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.623369 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-scripts" (OuterVolumeSpecName: "scripts") pod "f5da755b-2c82-4436-ab58-bc22b7888ae4" (UID: "f5da755b-2c82-4436-ab58-bc22b7888ae4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.648434 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-config-data" (OuterVolumeSpecName: "config-data") pod "f5da755b-2c82-4436-ab58-bc22b7888ae4" (UID: "f5da755b-2c82-4436-ab58-bc22b7888ae4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.653281 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5da755b-2c82-4436-ab58-bc22b7888ae4" (UID: "f5da755b-2c82-4436-ab58-bc22b7888ae4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.662894 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.723309 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.723379 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdn5z\" (UniqueName: \"kubernetes.io/projected/f5da755b-2c82-4436-ab58-bc22b7888ae4-kube-api-access-kdn5z\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.723399 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:37 crc kubenswrapper[4728]: I0125 05:55:37.723409 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5da755b-2c82-4436-ab58-bc22b7888ae4-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.031358 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zztbh" event={"ID":"f5da755b-2c82-4436-ab58-bc22b7888ae4","Type":"ContainerDied","Data":"b4eab9d9d2b6cc9563557727b1bef25d156f69af828db1f470542e1d2f95aa1d"} Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.031438 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4eab9d9d2b6cc9563557727b1bef25d156f69af828db1f470542e1d2f95aa1d" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.031384 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zztbh" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.034707 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerStarted","Data":"8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4"} Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.113126 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 25 05:55:38 crc kubenswrapper[4728]: E0125 05:55:38.113704 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5da755b-2c82-4436-ab58-bc22b7888ae4" containerName="nova-cell0-conductor-db-sync" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.113726 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5da755b-2c82-4436-ab58-bc22b7888ae4" containerName="nova-cell0-conductor-db-sync" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.113945 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5da755b-2c82-4436-ab58-bc22b7888ae4" containerName="nova-cell0-conductor-db-sync" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.114701 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.116952 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.117183 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-w2crb" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.128359 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.231957 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7wq6\" (UniqueName: \"kubernetes.io/projected/34d58c30-a0dd-40da-94b2-ab3cba2038ad-kube-api-access-d7wq6\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.232184 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d58c30-a0dd-40da-94b2-ab3cba2038ad-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.232355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d58c30-a0dd-40da-94b2-ab3cba2038ad-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.335303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/34d58c30-a0dd-40da-94b2-ab3cba2038ad-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.335669 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7wq6\" (UniqueName: \"kubernetes.io/projected/34d58c30-a0dd-40da-94b2-ab3cba2038ad-kube-api-access-d7wq6\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.335752 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d58c30-a0dd-40da-94b2-ab3cba2038ad-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.339937 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d58c30-a0dd-40da-94b2-ab3cba2038ad-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.343909 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d58c30-a0dd-40da-94b2-ab3cba2038ad-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.352299 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7wq6\" (UniqueName: \"kubernetes.io/projected/34d58c30-a0dd-40da-94b2-ab3cba2038ad-kube-api-access-d7wq6\") pod \"nova-cell0-conductor-0\" 
(UID: \"34d58c30-a0dd-40da-94b2-ab3cba2038ad\") " pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.428766 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:38 crc kubenswrapper[4728]: I0125 05:55:38.857479 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 25 05:55:39 crc kubenswrapper[4728]: I0125 05:55:39.042540 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"34d58c30-a0dd-40da-94b2-ab3cba2038ad","Type":"ContainerStarted","Data":"6a660025044d69babe8237f7318cd37be097e998f5a5032066cedd4c2bddbc14"} Jan 25 05:55:40 crc kubenswrapper[4728]: I0125 05:55:40.058974 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"34d58c30-a0dd-40da-94b2-ab3cba2038ad","Type":"ContainerStarted","Data":"968d08e182928a33e07ae42d655c1f8701f2bd504331eec0b8ede59814cb47dd"} Jan 25 05:55:40 crc kubenswrapper[4728]: I0125 05:55:40.059508 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:40 crc kubenswrapper[4728]: I0125 05:55:40.061528 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerStarted","Data":"f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3"} Jan 25 05:55:40 crc kubenswrapper[4728]: I0125 05:55:40.081300 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.081286541 podStartE2EDuration="2.081286541s" podCreationTimestamp="2026-01-25 05:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:40.078360814 +0000 UTC m=+1031.114238794" 
watchObservedRunningTime="2026-01-25 05:55:40.081286541 +0000 UTC m=+1031.117164521" Jan 25 05:55:42 crc kubenswrapper[4728]: I0125 05:55:42.079192 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerStarted","Data":"87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4"} Jan 25 05:55:42 crc kubenswrapper[4728]: I0125 05:55:42.079700 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 25 05:55:42 crc kubenswrapper[4728]: I0125 05:55:42.100574 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.728410605 podStartE2EDuration="7.100555384s" podCreationTimestamp="2026-01-25 05:55:35 +0000 UTC" firstStartedPulling="2026-01-25 05:55:35.810707732 +0000 UTC m=+1026.846585722" lastFinishedPulling="2026-01-25 05:55:41.182852521 +0000 UTC m=+1032.218730501" observedRunningTime="2026-01-25 05:55:42.093642067 +0000 UTC m=+1033.129520047" watchObservedRunningTime="2026-01-25 05:55:42.100555384 +0000 UTC m=+1033.136433364" Jan 25 05:55:42 crc kubenswrapper[4728]: I0125 05:55:42.902588 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:55:42 crc kubenswrapper[4728]: I0125 05:55:42.903298 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.451379 4728 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.930769 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bxr5x"] Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.937920 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.941426 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.941632 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.956184 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-scripts\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.956494 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k76r\" (UniqueName: \"kubernetes.io/projected/e3928592-b152-41a4-a787-6f723fdb1839-kube-api-access-5k76r\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.956627 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-config-data\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 
05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.956745 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:48 crc kubenswrapper[4728]: I0125 05:55:48.965995 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxr5x"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.059273 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.059370 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-scripts\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.059607 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k76r\" (UniqueName: \"kubernetes.io/projected/e3928592-b152-41a4-a787-6f723fdb1839-kube-api-access-5k76r\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.059723 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-config-data\") pod 
\"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.065921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-config-data\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.069227 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-scripts\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.079878 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k76r\" (UniqueName: \"kubernetes.io/projected/e3928592-b152-41a4-a787-6f723fdb1839-kube-api-access-5k76r\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.084884 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bxr5x\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.132936 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.134280 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.139090 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.153569 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.218864 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.229379 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.229487 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.234738 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.253777 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.263991 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/222860d6-29a5-481d-abc4-b1a36114e3ca-logs\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.264226 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.264286 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmsc9\" (UniqueName: \"kubernetes.io/projected/222860d6-29a5-481d-abc4-b1a36114e3ca-kube-api-access-lmsc9\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.265456 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-config-data\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.285140 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.286903 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.293983 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.316431 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.352096 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64fdb9bc75-sh8qj"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.355903 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fdb9bc75-sh8qj"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.355999 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.368753 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpf79\" (UniqueName: \"kubernetes.io/projected/6b9f35ef-850d-4957-aa3e-b3be97fc945f-kube-api-access-zpf79\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.368829 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-config-data\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.368912 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/222860d6-29a5-481d-abc4-b1a36114e3ca-logs\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc 
kubenswrapper[4728]: I0125 05:55:49.369098 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.369121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.369162 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmsc9\" (UniqueName: \"kubernetes.io/projected/222860d6-29a5-481d-abc4-b1a36114e3ca-kube-api-access-lmsc9\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.369195 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-config-data\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.372371 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/222860d6-29a5-481d-abc4-b1a36114e3ca-logs\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.375932 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-config-data\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.383400 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.385242 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.385441 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.399780 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.424369 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.431571 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmsc9\" (UniqueName: \"kubernetes.io/projected/222860d6-29a5-481d-abc4-b1a36114e3ca-kube-api-access-lmsc9\") pod \"nova-metadata-0\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.451108 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473137 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-sb\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473295 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463497c7-9458-496e-ab52-19afdd182da4-logs\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473436 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-config\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473503 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-swift-storage-0\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473527 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" 
Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473579 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-nb\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473621 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-svc\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473641 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-config-data\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473730 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpf79\" (UniqueName: \"kubernetes.io/projected/6b9f35ef-850d-4957-aa3e-b3be97fc945f-kube-api-access-zpf79\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473784 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qjm\" (UniqueName: \"kubernetes.io/projected/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-kube-api-access-f2qjm\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: 
I0125 05:55:49.473840 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-config-data\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473880 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.473894 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjq5c\" (UniqueName: \"kubernetes.io/projected/463497c7-9458-496e-ab52-19afdd182da4-kube-api-access-gjq5c\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.528956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.529391 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpf79\" (UniqueName: \"kubernetes.io/projected/6b9f35ef-850d-4957-aa3e-b3be97fc945f-kube-api-access-zpf79\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.533969 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-config-data\") pod \"nova-scheduler-0\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.546805 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579434 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463497c7-9458-496e-ab52-19afdd182da4-logs\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579468 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-config\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579492 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-swift-storage-0\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579513 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-nb\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-svc\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579567 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tb2k\" (UniqueName: \"kubernetes.io/projected/018525d6-89b2-4f6f-8833-c60a2e82aa86-kube-api-access-4tb2k\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579598 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qjm\" (UniqueName: \"kubernetes.io/projected/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-kube-api-access-f2qjm\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579625 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-config-data\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579645 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjq5c\" (UniqueName: \"kubernetes.io/projected/463497c7-9458-496e-ab52-19afdd182da4-kube-api-access-gjq5c\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579690 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.579709 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-sb\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.601381 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-swift-storage-0\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.601682 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463497c7-9458-496e-ab52-19afdd182da4-logs\") pod \"nova-api-0\" (UID: 
\"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.603235 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-nb\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.605438 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-svc\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.605784 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-sb\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.613674 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-config\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.615002 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.623952 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qjm\" (UniqueName: \"kubernetes.io/projected/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-kube-api-access-f2qjm\") pod \"dnsmasq-dns-64fdb9bc75-sh8qj\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.641491 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjq5c\" (UniqueName: \"kubernetes.io/projected/463497c7-9458-496e-ab52-19afdd182da4-kube-api-access-gjq5c\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.642882 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-config-data\") pod \"nova-api-0\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.685198 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.685307 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tb2k\" (UniqueName: \"kubernetes.io/projected/018525d6-89b2-4f6f-8833-c60a2e82aa86-kube-api-access-4tb2k\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.685383 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.687759 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.689869 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.697246 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.709839 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tb2k\" (UniqueName: \"kubernetes.io/projected/018525d6-89b2-4f6f-8833-c60a2e82aa86-kube-api-access-4tb2k\") pod \"nova-cell1-novncproxy-0\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.749557 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.770908 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:49 crc kubenswrapper[4728]: I0125 05:55:49.977241 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxr5x"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.125654 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.147554 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxr5x" event={"ID":"e3928592-b152-41a4-a787-6f723fdb1839","Type":"ContainerStarted","Data":"2917df753bfef958c626401a04fc9fba8e8d50007d767ac2ba5de7fd919d2b9d"} Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.148461 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"222860d6-29a5-481d-abc4-b1a36114e3ca","Type":"ContainerStarted","Data":"744c0bf24a6d065df99697d3c0a3f320b46f2365e92fcaefaa0574db09c2bdd3"} Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.222210 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.292782 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:55:50 crc kubenswrapper[4728]: W0125 05:55:50.293103 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod463497c7_9458_496e_ab52_19afdd182da4.slice/crio-f44b7d8660f6ff53d987ff9e644f2888e1765e66dd7ccf4b2eee91d3eed22cad WatchSource:0}: Error finding container f44b7d8660f6ff53d987ff9e644f2888e1765e66dd7ccf4b2eee91d3eed22cad: Status 404 returned error can't find the container with id f44b7d8660f6ff53d987ff9e644f2888e1765e66dd7ccf4b2eee91d3eed22cad Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.371059 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-64fdb9bc75-sh8qj"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.382044 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.461073 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tpsmt"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.462492 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.465532 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.466126 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.478369 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tpsmt"] Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.627005 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-config-data\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.627076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77zlv\" (UniqueName: \"kubernetes.io/projected/4211ead7-9238-4898-a53a-ba17b0495bb3-kube-api-access-77zlv\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.627279 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.627502 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-scripts\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.730050 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77zlv\" (UniqueName: \"kubernetes.io/projected/4211ead7-9238-4898-a53a-ba17b0495bb3-kube-api-access-77zlv\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.730591 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.730892 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-scripts\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc 
kubenswrapper[4728]: I0125 05:55:50.731046 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-config-data\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.735452 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-scripts\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.735601 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-config-data\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.736886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.744712 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77zlv\" (UniqueName: \"kubernetes.io/projected/4211ead7-9238-4898-a53a-ba17b0495bb3-kube-api-access-77zlv\") pod \"nova-cell1-conductor-db-sync-tpsmt\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:50 crc kubenswrapper[4728]: I0125 05:55:50.793776 
4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.162792 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"018525d6-89b2-4f6f-8833-c60a2e82aa86","Type":"ContainerStarted","Data":"4afbd8aa8bc85c7476a5dccae4f44f56b9f397be1e434eff59225db6f4f25f10"} Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.164850 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxr5x" event={"ID":"e3928592-b152-41a4-a787-6f723fdb1839","Type":"ContainerStarted","Data":"1310718fe65447406c8e15ada9ffccd88ed8edced5e206ef84f6b272958be2fc"} Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.165838 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"463497c7-9458-496e-ab52-19afdd182da4","Type":"ContainerStarted","Data":"f44b7d8660f6ff53d987ff9e644f2888e1765e66dd7ccf4b2eee91d3eed22cad"} Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.168078 4728 generic.go:334] "Generic (PLEG): container finished" podID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerID="7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987" exitCode=0 Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.168164 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" event={"ID":"33e5b2d8-3161-4b8e-b40a-64b3c6c09138","Type":"ContainerDied","Data":"7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987"} Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.168197 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" event={"ID":"33e5b2d8-3161-4b8e-b40a-64b3c6c09138","Type":"ContainerStarted","Data":"9d9ebc4afe57d7a198389504197e49c8971611eebaad7a3f147ce8fa4a786016"} Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.171336 
4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b9f35ef-850d-4957-aa3e-b3be97fc945f","Type":"ContainerStarted","Data":"190bae129c3f25f579688dd82e1ed273de90adcfce2a1521638080ac304be37f"} Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.189119 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bxr5x" podStartSLOduration=3.189102437 podStartE2EDuration="3.189102437s" podCreationTimestamp="2026-01-25 05:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:51.179062224 +0000 UTC m=+1042.214940204" watchObservedRunningTime="2026-01-25 05:55:51.189102437 +0000 UTC m=+1042.224980417" Jan 25 05:55:51 crc kubenswrapper[4728]: I0125 05:55:51.223451 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tpsmt"] Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.212956 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" event={"ID":"4211ead7-9238-4898-a53a-ba17b0495bb3","Type":"ContainerStarted","Data":"4fd745d14345d24411a0e9bb2f13fb4238c51c27ced5dda05b89271297428522"} Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.213255 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" event={"ID":"4211ead7-9238-4898-a53a-ba17b0495bb3","Type":"ContainerStarted","Data":"1f5e5d711a7133cec30780ee041b3ea2a906297c95bbb85a5564781ca04775d5"} Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.253848 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" podStartSLOduration=2.253829939 podStartE2EDuration="2.253829939s" podCreationTimestamp="2026-01-25 05:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:52.250946121 +0000 UTC m=+1043.286824100" watchObservedRunningTime="2026-01-25 05:55:52.253829939 +0000 UTC m=+1043.289707919" Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.254119 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" event={"ID":"33e5b2d8-3161-4b8e-b40a-64b3c6c09138","Type":"ContainerStarted","Data":"00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae"} Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.257708 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.941925 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" podStartSLOduration=3.941908576 podStartE2EDuration="3.941908576s" podCreationTimestamp="2026-01-25 05:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:52.289899468 +0000 UTC m=+1043.325777448" watchObservedRunningTime="2026-01-25 05:55:52.941908576 +0000 UTC m=+1043.977786556" Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.943116 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:52 crc kubenswrapper[4728]: I0125 05:55:52.993338 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.277852 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"018525d6-89b2-4f6f-8833-c60a2e82aa86","Type":"ContainerStarted","Data":"10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a"} Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.278008 4728 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="018525d6-89b2-4f6f-8833-c60a2e82aa86" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a" gracePeriod=30 Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.282847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"222860d6-29a5-481d-abc4-b1a36114e3ca","Type":"ContainerStarted","Data":"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a"} Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.282894 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"222860d6-29a5-481d-abc4-b1a36114e3ca","Type":"ContainerStarted","Data":"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64"} Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.283062 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-log" containerID="cri-o://07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64" gracePeriod=30 Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.283084 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-metadata" containerID="cri-o://19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a" gracePeriod=30 Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.290199 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"463497c7-9458-496e-ab52-19afdd182da4","Type":"ContainerStarted","Data":"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa"} Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.290582 4728 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"463497c7-9458-496e-ab52-19afdd182da4","Type":"ContainerStarted","Data":"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d"} Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.291635 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b9f35ef-850d-4957-aa3e-b3be97fc945f","Type":"ContainerStarted","Data":"150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae"} Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.295958 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.438482629 podStartE2EDuration="5.295947823s" podCreationTimestamp="2026-01-25 05:55:49 +0000 UTC" firstStartedPulling="2026-01-25 05:55:50.383137451 +0000 UTC m=+1041.419015430" lastFinishedPulling="2026-01-25 05:55:53.240602644 +0000 UTC m=+1044.276480624" observedRunningTime="2026-01-25 05:55:54.290004124 +0000 UTC m=+1045.325882104" watchObservedRunningTime="2026-01-25 05:55:54.295947823 +0000 UTC m=+1045.331825802" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.315094 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.200235583 podStartE2EDuration="5.315085251s" podCreationTimestamp="2026-01-25 05:55:49 +0000 UTC" firstStartedPulling="2026-01-25 05:55:50.125807599 +0000 UTC m=+1041.161685579" lastFinishedPulling="2026-01-25 05:55:53.240657266 +0000 UTC m=+1044.276535247" observedRunningTime="2026-01-25 05:55:54.312587381 +0000 UTC m=+1045.348465361" watchObservedRunningTime="2026-01-25 05:55:54.315085251 +0000 UTC m=+1045.350963232" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.327971 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.320973841 podStartE2EDuration="5.327963357s" podCreationTimestamp="2026-01-25 05:55:49 
+0000 UTC" firstStartedPulling="2026-01-25 05:55:50.230234386 +0000 UTC m=+1041.266112366" lastFinishedPulling="2026-01-25 05:55:53.237223902 +0000 UTC m=+1044.273101882" observedRunningTime="2026-01-25 05:55:54.32492153 +0000 UTC m=+1045.360799510" watchObservedRunningTime="2026-01-25 05:55:54.327963357 +0000 UTC m=+1045.363841337" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.341456 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.390891 podStartE2EDuration="5.341440943s" podCreationTimestamp="2026-01-25 05:55:49 +0000 UTC" firstStartedPulling="2026-01-25 05:55:50.296844349 +0000 UTC m=+1041.332722329" lastFinishedPulling="2026-01-25 05:55:53.247394291 +0000 UTC m=+1044.283272272" observedRunningTime="2026-01-25 05:55:54.337312377 +0000 UTC m=+1045.373190357" watchObservedRunningTime="2026-01-25 05:55:54.341440943 +0000 UTC m=+1045.377318924" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.451402 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.451478 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.548243 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.771839 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.807257 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.928457 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmsc9\" (UniqueName: \"kubernetes.io/projected/222860d6-29a5-481d-abc4-b1a36114e3ca-kube-api-access-lmsc9\") pod \"222860d6-29a5-481d-abc4-b1a36114e3ca\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.928630 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/222860d6-29a5-481d-abc4-b1a36114e3ca-logs\") pod \"222860d6-29a5-481d-abc4-b1a36114e3ca\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.928735 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-combined-ca-bundle\") pod \"222860d6-29a5-481d-abc4-b1a36114e3ca\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.928789 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-config-data\") pod \"222860d6-29a5-481d-abc4-b1a36114e3ca\" (UID: \"222860d6-29a5-481d-abc4-b1a36114e3ca\") " Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.929443 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/222860d6-29a5-481d-abc4-b1a36114e3ca-logs" (OuterVolumeSpecName: "logs") pod "222860d6-29a5-481d-abc4-b1a36114e3ca" (UID: "222860d6-29a5-481d-abc4-b1a36114e3ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.934099 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222860d6-29a5-481d-abc4-b1a36114e3ca-kube-api-access-lmsc9" (OuterVolumeSpecName: "kube-api-access-lmsc9") pod "222860d6-29a5-481d-abc4-b1a36114e3ca" (UID: "222860d6-29a5-481d-abc4-b1a36114e3ca"). InnerVolumeSpecName "kube-api-access-lmsc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.959901 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "222860d6-29a5-481d-abc4-b1a36114e3ca" (UID: "222860d6-29a5-481d-abc4-b1a36114e3ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:54 crc kubenswrapper[4728]: I0125 05:55:54.960463 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-config-data" (OuterVolumeSpecName: "config-data") pod "222860d6-29a5-481d-abc4-b1a36114e3ca" (UID: "222860d6-29a5-481d-abc4-b1a36114e3ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.032154 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.032195 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222860d6-29a5-481d-abc4-b1a36114e3ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.032207 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmsc9\" (UniqueName: \"kubernetes.io/projected/222860d6-29a5-481d-abc4-b1a36114e3ca-kube-api-access-lmsc9\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.032217 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/222860d6-29a5-481d-abc4-b1a36114e3ca-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301719 4728 generic.go:334] "Generic (PLEG): container finished" podID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerID="19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a" exitCode=0 Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301752 4728 generic.go:334] "Generic (PLEG): container finished" podID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerID="07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64" exitCode=143 Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301856 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"222860d6-29a5-481d-abc4-b1a36114e3ca","Type":"ContainerDied","Data":"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a"} Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301889 4728 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301926 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"222860d6-29a5-481d-abc4-b1a36114e3ca","Type":"ContainerDied","Data":"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64"} Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301941 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"222860d6-29a5-481d-abc4-b1a36114e3ca","Type":"ContainerDied","Data":"744c0bf24a6d065df99697d3c0a3f320b46f2365e92fcaefaa0574db09c2bdd3"} Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.301961 4728 scope.go:117] "RemoveContainer" containerID="19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.326436 4728 scope.go:117] "RemoveContainer" containerID="07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.347040 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.347079 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.351881 4728 scope.go:117] "RemoveContainer" containerID="19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a" Jan 25 05:55:55 crc kubenswrapper[4728]: E0125 05:55:55.355728 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a\": container with ID starting with 19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a not found: ID does not exist" 
containerID="19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.355812 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a"} err="failed to get container status \"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a\": rpc error: code = NotFound desc = could not find container \"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a\": container with ID starting with 19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a not found: ID does not exist" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.355856 4728 scope.go:117] "RemoveContainer" containerID="07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64" Jan 25 05:55:55 crc kubenswrapper[4728]: E0125 05:55:55.359717 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64\": container with ID starting with 07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64 not found: ID does not exist" containerID="07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.359809 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64"} err="failed to get container status \"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64\": rpc error: code = NotFound desc = could not find container \"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64\": container with ID starting with 07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64 not found: ID does not exist" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.359847 4728 scope.go:117] 
"RemoveContainer" containerID="19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.362693 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a"} err="failed to get container status \"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a\": rpc error: code = NotFound desc = could not find container \"19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a\": container with ID starting with 19d18a8e8b0e285cf599c7c60e3293e8be87ee8765e1217161040b384fec817a not found: ID does not exist" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.362714 4728 scope.go:117] "RemoveContainer" containerID="07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.363003 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64"} err="failed to get container status \"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64\": rpc error: code = NotFound desc = could not find container \"07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64\": container with ID starting with 07c2f0ea8b6fa43d51b0aabdbe75fc0920760e52fe79748180c86b5ec28f1d64 not found: ID does not exist" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.382566 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:55 crc kubenswrapper[4728]: E0125 05:55:55.383367 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-metadata" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.383801 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" 
containerName="nova-metadata-metadata" Jan 25 05:55:55 crc kubenswrapper[4728]: E0125 05:55:55.383838 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-log" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.383846 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-log" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.384388 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-log" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.384437 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" containerName="nova-metadata-metadata" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.386845 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.390437 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.392660 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.392860 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.544950 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40634a2e-4d94-4f12-a5ab-c254f803bf16-logs\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.545058 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjx5\" (UniqueName: \"kubernetes.io/projected/40634a2e-4d94-4f12-a5ab-c254f803bf16-kube-api-access-pzjx5\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.545415 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-config-data\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.545717 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.545822 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.647163 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.647202 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.647242 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40634a2e-4d94-4f12-a5ab-c254f803bf16-logs\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.647302 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjx5\" (UniqueName: \"kubernetes.io/projected/40634a2e-4d94-4f12-a5ab-c254f803bf16-kube-api-access-pzjx5\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.647371 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-config-data\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.648248 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40634a2e-4d94-4f12-a5ab-c254f803bf16-logs\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.651561 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " 
pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.657403 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-config-data\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.660084 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.660405 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjx5\" (UniqueName: \"kubernetes.io/projected/40634a2e-4d94-4f12-a5ab-c254f803bf16-kube-api-access-pzjx5\") pod \"nova-metadata-0\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " pod="openstack/nova-metadata-0" Jan 25 05:55:55 crc kubenswrapper[4728]: I0125 05:55:55.705964 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:55:56 crc kubenswrapper[4728]: I0125 05:55:56.095714 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:56 crc kubenswrapper[4728]: W0125 05:55:56.102463 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40634a2e_4d94_4f12_a5ab_c254f803bf16.slice/crio-a7279a6abb3a1611477bc0ba3c0e84417080a4d5427a7a946b5b69e3676f3257 WatchSource:0}: Error finding container a7279a6abb3a1611477bc0ba3c0e84417080a4d5427a7a946b5b69e3676f3257: Status 404 returned error can't find the container with id a7279a6abb3a1611477bc0ba3c0e84417080a4d5427a7a946b5b69e3676f3257 Jan 25 05:55:56 crc kubenswrapper[4728]: I0125 05:55:56.315368 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40634a2e-4d94-4f12-a5ab-c254f803bf16","Type":"ContainerStarted","Data":"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a"} Jan 25 05:55:56 crc kubenswrapper[4728]: I0125 05:55:56.315611 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40634a2e-4d94-4f12-a5ab-c254f803bf16","Type":"ContainerStarted","Data":"a7279a6abb3a1611477bc0ba3c0e84417080a4d5427a7a946b5b69e3676f3257"} Jan 25 05:55:56 crc kubenswrapper[4728]: I0125 05:55:56.317831 4728 generic.go:334] "Generic (PLEG): container finished" podID="4211ead7-9238-4898-a53a-ba17b0495bb3" containerID="4fd745d14345d24411a0e9bb2f13fb4238c51c27ced5dda05b89271297428522" exitCode=0 Jan 25 05:55:56 crc kubenswrapper[4728]: I0125 05:55:56.317872 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" event={"ID":"4211ead7-9238-4898-a53a-ba17b0495bb3","Type":"ContainerDied","Data":"4fd745d14345d24411a0e9bb2f13fb4238c51c27ced5dda05b89271297428522"} Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.332899 4728 
generic.go:334] "Generic (PLEG): container finished" podID="e3928592-b152-41a4-a787-6f723fdb1839" containerID="1310718fe65447406c8e15ada9ffccd88ed8edced5e206ef84f6b272958be2fc" exitCode=0 Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.338539 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="222860d6-29a5-481d-abc4-b1a36114e3ca" path="/var/lib/kubelet/pods/222860d6-29a5-481d-abc4-b1a36114e3ca/volumes" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.339276 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxr5x" event={"ID":"e3928592-b152-41a4-a787-6f723fdb1839","Type":"ContainerDied","Data":"1310718fe65447406c8e15ada9ffccd88ed8edced5e206ef84f6b272958be2fc"} Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.339315 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40634a2e-4d94-4f12-a5ab-c254f803bf16","Type":"ContainerStarted","Data":"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d"} Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.367374 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.367358032 podStartE2EDuration="2.367358032s" podCreationTimestamp="2026-01-25 05:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:55:57.351734197 +0000 UTC m=+1048.387612178" watchObservedRunningTime="2026-01-25 05:55:57.367358032 +0000 UTC m=+1048.403236012" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.649031 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.788195 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77zlv\" (UniqueName: \"kubernetes.io/projected/4211ead7-9238-4898-a53a-ba17b0495bb3-kube-api-access-77zlv\") pod \"4211ead7-9238-4898-a53a-ba17b0495bb3\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.788359 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-config-data\") pod \"4211ead7-9238-4898-a53a-ba17b0495bb3\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.788607 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-scripts\") pod \"4211ead7-9238-4898-a53a-ba17b0495bb3\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.788853 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-combined-ca-bundle\") pod \"4211ead7-9238-4898-a53a-ba17b0495bb3\" (UID: \"4211ead7-9238-4898-a53a-ba17b0495bb3\") " Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.800903 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4211ead7-9238-4898-a53a-ba17b0495bb3-kube-api-access-77zlv" (OuterVolumeSpecName: "kube-api-access-77zlv") pod "4211ead7-9238-4898-a53a-ba17b0495bb3" (UID: "4211ead7-9238-4898-a53a-ba17b0495bb3"). InnerVolumeSpecName "kube-api-access-77zlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.801799 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-scripts" (OuterVolumeSpecName: "scripts") pod "4211ead7-9238-4898-a53a-ba17b0495bb3" (UID: "4211ead7-9238-4898-a53a-ba17b0495bb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.813559 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4211ead7-9238-4898-a53a-ba17b0495bb3" (UID: "4211ead7-9238-4898-a53a-ba17b0495bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.816701 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-config-data" (OuterVolumeSpecName: "config-data") pod "4211ead7-9238-4898-a53a-ba17b0495bb3" (UID: "4211ead7-9238-4898-a53a-ba17b0495bb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.893375 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77zlv\" (UniqueName: \"kubernetes.io/projected/4211ead7-9238-4898-a53a-ba17b0495bb3-kube-api-access-77zlv\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.893408 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.893419 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:57 crc kubenswrapper[4728]: I0125 05:55:57.893429 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4211ead7-9238-4898-a53a-ba17b0495bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.348881 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" event={"ID":"4211ead7-9238-4898-a53a-ba17b0495bb3","Type":"ContainerDied","Data":"1f5e5d711a7133cec30780ee041b3ea2a906297c95bbb85a5564781ca04775d5"} Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.348932 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f5e5d711a7133cec30780ee041b3ea2a906297c95bbb85a5564781ca04775d5" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.348998 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tpsmt" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.402934 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 25 05:55:58 crc kubenswrapper[4728]: E0125 05:55:58.403355 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4211ead7-9238-4898-a53a-ba17b0495bb3" containerName="nova-cell1-conductor-db-sync" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.403374 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4211ead7-9238-4898-a53a-ba17b0495bb3" containerName="nova-cell1-conductor-db-sync" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.403573 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4211ead7-9238-4898-a53a-ba17b0495bb3" containerName="nova-cell1-conductor-db-sync" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.404127 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.409368 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.412272 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.503628 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae38c3f-b058-49a8-8df8-5222dc364151-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.503902 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvkn\" (UniqueName: 
\"kubernetes.io/projected/8ae38c3f-b058-49a8-8df8-5222dc364151-kube-api-access-rxvkn\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.504020 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae38c3f-b058-49a8-8df8-5222dc364151-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.607033 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae38c3f-b058-49a8-8df8-5222dc364151-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.607126 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvkn\" (UniqueName: \"kubernetes.io/projected/8ae38c3f-b058-49a8-8df8-5222dc364151-kube-api-access-rxvkn\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.607171 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae38c3f-b058-49a8-8df8-5222dc364151-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.612173 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ae38c3f-b058-49a8-8df8-5222dc364151-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.612245 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae38c3f-b058-49a8-8df8-5222dc364151-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.622802 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvkn\" (UniqueName: \"kubernetes.io/projected/8ae38c3f-b058-49a8-8df8-5222dc364151-kube-api-access-rxvkn\") pod \"nova-cell1-conductor-0\" (UID: \"8ae38c3f-b058-49a8-8df8-5222dc364151\") " pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.698305 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.719753 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.814567 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-combined-ca-bundle\") pod \"e3928592-b152-41a4-a787-6f723fdb1839\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.814692 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-config-data\") pod \"e3928592-b152-41a4-a787-6f723fdb1839\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.814734 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-scripts\") pod \"e3928592-b152-41a4-a787-6f723fdb1839\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.814949 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k76r\" (UniqueName: \"kubernetes.io/projected/e3928592-b152-41a4-a787-6f723fdb1839-kube-api-access-5k76r\") pod \"e3928592-b152-41a4-a787-6f723fdb1839\" (UID: \"e3928592-b152-41a4-a787-6f723fdb1839\") " Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.828816 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3928592-b152-41a4-a787-6f723fdb1839-kube-api-access-5k76r" (OuterVolumeSpecName: "kube-api-access-5k76r") pod "e3928592-b152-41a4-a787-6f723fdb1839" (UID: "e3928592-b152-41a4-a787-6f723fdb1839"). InnerVolumeSpecName "kube-api-access-5k76r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.834403 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-scripts" (OuterVolumeSpecName: "scripts") pod "e3928592-b152-41a4-a787-6f723fdb1839" (UID: "e3928592-b152-41a4-a787-6f723fdb1839"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.861577 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3928592-b152-41a4-a787-6f723fdb1839" (UID: "e3928592-b152-41a4-a787-6f723fdb1839"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.894139 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-config-data" (OuterVolumeSpecName: "config-data") pod "e3928592-b152-41a4-a787-6f723fdb1839" (UID: "e3928592-b152-41a4-a787-6f723fdb1839"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.917876 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k76r\" (UniqueName: \"kubernetes.io/projected/e3928592-b152-41a4-a787-6f723fdb1839-kube-api-access-5k76r\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.917912 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.917924 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:58 crc kubenswrapper[4728]: I0125 05:55:58.917935 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3928592-b152-41a4-a787-6f723fdb1839-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:55:59 crc kubenswrapper[4728]: W0125 05:55:59.242605 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae38c3f_b058_49a8_8df8_5222dc364151.slice/crio-e7a0d1820f2e95dc1493467749fc11808de359d0ffe30e401a81d1d106bc2a7d WatchSource:0}: Error finding container e7a0d1820f2e95dc1493467749fc11808de359d0ffe30e401a81d1d106bc2a7d: Status 404 returned error can't find the container with id e7a0d1820f2e95dc1493467749fc11808de359d0ffe30e401a81d1d106bc2a7d Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.243106 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.358814 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"8ae38c3f-b058-49a8-8df8-5222dc364151","Type":"ContainerStarted","Data":"e7a0d1820f2e95dc1493467749fc11808de359d0ffe30e401a81d1d106bc2a7d"} Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.360868 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxr5x" event={"ID":"e3928592-b152-41a4-a787-6f723fdb1839","Type":"ContainerDied","Data":"2917df753bfef958c626401a04fc9fba8e8d50007d767ac2ba5de7fd919d2b9d"} Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.360929 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxr5x" Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.360954 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2917df753bfef958c626401a04fc9fba8e8d50007d767ac2ba5de7fd919d2b9d" Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.541789 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.542007 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6b9f35ef-850d-4957-aa3e-b3be97fc945f" containerName="nova-scheduler-scheduler" containerID="cri-o://150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae" gracePeriod=30 Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.550189 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.550401 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-log" containerID="cri-o://37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d" gracePeriod=30 Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.550453 4728 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-api-0" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-api" containerID="cri-o://39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa" gracePeriod=30 Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.556158 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.556423 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-log" containerID="cri-o://244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a" gracePeriod=30 Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.556608 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-metadata" containerID="cri-o://bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d" gracePeriod=30 Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.751250 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.818531 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb49f8487-rfk6x"] Jan 25 05:55:59 crc kubenswrapper[4728]: I0125 05:55:59.818778 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerName="dnsmasq-dns" containerID="cri-o://73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc" gracePeriod=10 Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.106228 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.112388 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.248652 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259377 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-config-data\") pod \"40634a2e-4d94-4f12-a5ab-c254f803bf16\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259441 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjq5c\" (UniqueName: \"kubernetes.io/projected/463497c7-9458-496e-ab52-19afdd182da4-kube-api-access-gjq5c\") pod \"463497c7-9458-496e-ab52-19afdd182da4\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259471 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40634a2e-4d94-4f12-a5ab-c254f803bf16-logs\") pod \"40634a2e-4d94-4f12-a5ab-c254f803bf16\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259493 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-combined-ca-bundle\") pod \"40634a2e-4d94-4f12-a5ab-c254f803bf16\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259513 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-combined-ca-bundle\") pod \"463497c7-9458-496e-ab52-19afdd182da4\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259630 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-config-data\") pod \"463497c7-9458-496e-ab52-19afdd182da4\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259746 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-nova-metadata-tls-certs\") pod \"40634a2e-4d94-4f12-a5ab-c254f803bf16\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.259806 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463497c7-9458-496e-ab52-19afdd182da4-logs\") pod \"463497c7-9458-496e-ab52-19afdd182da4\" (UID: \"463497c7-9458-496e-ab52-19afdd182da4\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.260035 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzjx5\" (UniqueName: \"kubernetes.io/projected/40634a2e-4d94-4f12-a5ab-c254f803bf16-kube-api-access-pzjx5\") pod \"40634a2e-4d94-4f12-a5ab-c254f803bf16\" (UID: \"40634a2e-4d94-4f12-a5ab-c254f803bf16\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.260286 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40634a2e-4d94-4f12-a5ab-c254f803bf16-logs" (OuterVolumeSpecName: "logs") pod "40634a2e-4d94-4f12-a5ab-c254f803bf16" (UID: "40634a2e-4d94-4f12-a5ab-c254f803bf16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.260351 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463497c7-9458-496e-ab52-19afdd182da4-logs" (OuterVolumeSpecName: "logs") pod "463497c7-9458-496e-ab52-19afdd182da4" (UID: "463497c7-9458-496e-ab52-19afdd182da4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.260828 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463497c7-9458-496e-ab52-19afdd182da4-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.260847 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40634a2e-4d94-4f12-a5ab-c254f803bf16-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.268547 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463497c7-9458-496e-ab52-19afdd182da4-kube-api-access-gjq5c" (OuterVolumeSpecName: "kube-api-access-gjq5c") pod "463497c7-9458-496e-ab52-19afdd182da4" (UID: "463497c7-9458-496e-ab52-19afdd182da4"). InnerVolumeSpecName "kube-api-access-gjq5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.282115 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40634a2e-4d94-4f12-a5ab-c254f803bf16-kube-api-access-pzjx5" (OuterVolumeSpecName: "kube-api-access-pzjx5") pod "40634a2e-4d94-4f12-a5ab-c254f803bf16" (UID: "40634a2e-4d94-4f12-a5ab-c254f803bf16"). InnerVolumeSpecName "kube-api-access-pzjx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.296546 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40634a2e-4d94-4f12-a5ab-c254f803bf16" (UID: "40634a2e-4d94-4f12-a5ab-c254f803bf16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.301129 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "463497c7-9458-496e-ab52-19afdd182da4" (UID: "463497c7-9458-496e-ab52-19afdd182da4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.304672 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-config-data" (OuterVolumeSpecName: "config-data") pod "463497c7-9458-496e-ab52-19afdd182da4" (UID: "463497c7-9458-496e-ab52-19afdd182da4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.306440 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-config-data" (OuterVolumeSpecName: "config-data") pod "40634a2e-4d94-4f12-a5ab-c254f803bf16" (UID: "40634a2e-4d94-4f12-a5ab-c254f803bf16"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.322097 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "40634a2e-4d94-4f12-a5ab-c254f803bf16" (UID: "40634a2e-4d94-4f12-a5ab-c254f803bf16"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.361942 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2rw\" (UniqueName: \"kubernetes.io/projected/c05bf129-98d0-4259-b8f8-f86b1e68b084-kube-api-access-rf2rw\") pod \"c05bf129-98d0-4259-b8f8-f86b1e68b084\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.362427 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-swift-storage-0\") pod \"c05bf129-98d0-4259-b8f8-f86b1e68b084\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.362567 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-svc\") pod \"c05bf129-98d0-4259-b8f8-f86b1e68b084\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.362655 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-config\") pod \"c05bf129-98d0-4259-b8f8-f86b1e68b084\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.362747 4728 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-nb\") pod \"c05bf129-98d0-4259-b8f8-f86b1e68b084\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.362802 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-sb\") pod \"c05bf129-98d0-4259-b8f8-f86b1e68b084\" (UID: \"c05bf129-98d0-4259-b8f8-f86b1e68b084\") " Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363348 4728 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363367 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzjx5\" (UniqueName: \"kubernetes.io/projected/40634a2e-4d94-4f12-a5ab-c254f803bf16-kube-api-access-pzjx5\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363377 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363386 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjq5c\" (UniqueName: \"kubernetes.io/projected/463497c7-9458-496e-ab52-19afdd182da4-kube-api-access-gjq5c\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363396 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40634a2e-4d94-4f12-a5ab-c254f803bf16-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363424 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.363431 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463497c7-9458-496e-ab52-19afdd182da4-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.368377 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05bf129-98d0-4259-b8f8-f86b1e68b084-kube-api-access-rf2rw" (OuterVolumeSpecName: "kube-api-access-rf2rw") pod "c05bf129-98d0-4259-b8f8-f86b1e68b084" (UID: "c05bf129-98d0-4259-b8f8-f86b1e68b084"). InnerVolumeSpecName "kube-api-access-rf2rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.372785 4728 generic.go:334] "Generic (PLEG): container finished" podID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerID="73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc" exitCode=0 Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.372915 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.372941 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" event={"ID":"c05bf129-98d0-4259-b8f8-f86b1e68b084","Type":"ContainerDied","Data":"73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.373578 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb49f8487-rfk6x" event={"ID":"c05bf129-98d0-4259-b8f8-f86b1e68b084","Type":"ContainerDied","Data":"c867656b4c572559f4614fe953085d7839565c880416addb14bf5688db42f353"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.373633 4728 scope.go:117] "RemoveContainer" containerID="73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.379202 4728 generic.go:334] "Generic (PLEG): container finished" podID="463497c7-9458-496e-ab52-19afdd182da4" containerID="39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa" exitCode=0 Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.379244 4728 generic.go:334] "Generic (PLEG): container finished" podID="463497c7-9458-496e-ab52-19afdd182da4" containerID="37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d" exitCode=143 Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.379279 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"463497c7-9458-496e-ab52-19afdd182da4","Type":"ContainerDied","Data":"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.379297 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"463497c7-9458-496e-ab52-19afdd182da4","Type":"ContainerDied","Data":"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d"} Jan 25 05:56:00 crc 
kubenswrapper[4728]: I0125 05:56:00.379360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"463497c7-9458-496e-ab52-19afdd182da4","Type":"ContainerDied","Data":"f44b7d8660f6ff53d987ff9e644f2888e1765e66dd7ccf4b2eee91d3eed22cad"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.379434 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.383233 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8ae38c3f-b058-49a8-8df8-5222dc364151","Type":"ContainerStarted","Data":"f6235ed4abb75e6ca08085ea7d91b8c522c023d8e5c6924d7ec7e970c0c7698e"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.383336 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.384985 4728 generic.go:334] "Generic (PLEG): container finished" podID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerID="bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d" exitCode=0 Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.385016 4728 generic.go:334] "Generic (PLEG): container finished" podID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerID="244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a" exitCode=143 Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.385032 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40634a2e-4d94-4f12-a5ab-c254f803bf16","Type":"ContainerDied","Data":"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.385050 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"40634a2e-4d94-4f12-a5ab-c254f803bf16","Type":"ContainerDied","Data":"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.385066 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40634a2e-4d94-4f12-a5ab-c254f803bf16","Type":"ContainerDied","Data":"a7279a6abb3a1611477bc0ba3c0e84417080a4d5427a7a946b5b69e3676f3257"} Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.385114 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.408286 4728 scope.go:117] "RemoveContainer" containerID="17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.409369 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.40935333 podStartE2EDuration="2.40935333s" podCreationTimestamp="2026-01-25 05:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:00.400350121 +0000 UTC m=+1051.436228122" watchObservedRunningTime="2026-01-25 05:56:00.40935333 +0000 UTC m=+1051.445231310" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.413998 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c05bf129-98d0-4259-b8f8-f86b1e68b084" (UID: "c05bf129-98d0-4259-b8f8-f86b1e68b084"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.417753 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c05bf129-98d0-4259-b8f8-f86b1e68b084" (UID: "c05bf129-98d0-4259-b8f8-f86b1e68b084"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.418927 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-config" (OuterVolumeSpecName: "config") pod "c05bf129-98d0-4259-b8f8-f86b1e68b084" (UID: "c05bf129-98d0-4259-b8f8-f86b1e68b084"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.428931 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c05bf129-98d0-4259-b8f8-f86b1e68b084" (UID: "c05bf129-98d0-4259-b8f8-f86b1e68b084"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.473985 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.475367 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.475387 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.475397 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.475406 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.475415 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf2rw\" (UniqueName: \"kubernetes.io/projected/c05bf129-98d0-4259-b8f8-f86b1e68b084-kube-api-access-rf2rw\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.481272 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.483091 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c05bf129-98d0-4259-b8f8-f86b1e68b084" (UID: 
"c05bf129-98d0-4259-b8f8-f86b1e68b084"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.487422 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.492336 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.492387 4728 scope.go:117] "RemoveContainer" containerID="73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.492944 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc\": container with ID starting with 73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc not found: ID does not exist" containerID="73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.492984 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc"} err="failed to get container status \"73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc\": rpc error: code = NotFound desc = could not find container \"73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc\": container with ID starting with 73b871e555d4b7d76839c43b71e46e1ee12450379419f5bf959469000baa47cc not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.493019 4728 scope.go:117] "RemoveContainer" containerID="17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.493566 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3\": container with ID starting with 17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3 not found: ID does not exist" containerID="17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.493605 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3"} err="failed to get container status \"17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3\": rpc error: code = NotFound desc = could not find container \"17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3\": container with ID starting with 17b9203a5a4a8af66c47072cdfba66e35f590c7d60b18a2641e70e6c2b4bd6f3 not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.493636 4728 scope.go:117] "RemoveContainer" containerID="39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.499975 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500349 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-api" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500363 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-api" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500377 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3928592-b152-41a4-a787-6f723fdb1839" containerName="nova-manage" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500383 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3928592-b152-41a4-a787-6f723fdb1839" 
containerName="nova-manage" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500394 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-metadata" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500401 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-metadata" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500419 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerName="dnsmasq-dns" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500425 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerName="dnsmasq-dns" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500434 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerName="init" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500439 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerName="init" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500453 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-log" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500458 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-log" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.500470 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-log" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500475 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-log" Jan 
25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500634 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3928592-b152-41a4-a787-6f723fdb1839" containerName="nova-manage" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500648 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-metadata" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500660 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-api" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500669 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" containerName="nova-metadata-log" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500677 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" containerName="dnsmasq-dns" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.500691 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="463497c7-9458-496e-ab52-19afdd182da4" containerName="nova-api-log" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.508107 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.514628 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.514894 4728 scope.go:117] "RemoveContainer" containerID="37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.518334 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.538344 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.540198 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.542107 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.542451 4728 scope.go:117] "RemoveContainer" containerID="39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.542648 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.544584 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.546912 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa\": container with ID starting with 39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa not found: ID does not exist" 
containerID="39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.546967 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa"} err="failed to get container status \"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa\": rpc error: code = NotFound desc = could not find container \"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa\": container with ID starting with 39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.546996 4728 scope.go:117] "RemoveContainer" containerID="37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.549182 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d\": container with ID starting with 37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d not found: ID does not exist" containerID="37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.549217 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d"} err="failed to get container status \"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d\": rpc error: code = NotFound desc = could not find container \"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d\": container with ID starting with 37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.549240 4728 scope.go:117] 
"RemoveContainer" containerID="39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.549445 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa"} err="failed to get container status \"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa\": rpc error: code = NotFound desc = could not find container \"39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa\": container with ID starting with 39d99a21d9aeb977e1f7f52dbab4c6286b7cdaf2edd56d0faeb95f72e001d3aa not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.549464 4728 scope.go:117] "RemoveContainer" containerID="37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.549978 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d"} err="failed to get container status \"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d\": rpc error: code = NotFound desc = could not find container \"37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d\": container with ID starting with 37321a3146ff5319a7db0d50e1571890c52e0e5079ba94d334dee2893e1d518d not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.550016 4728 scope.go:117] "RemoveContainer" containerID="bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.578631 4728 scope.go:117] "RemoveContainer" containerID="244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.579611 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c05bf129-98d0-4259-b8f8-f86b1e68b084-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681615 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8008816-7c7f-48f2-9802-93e91e3faefc-logs\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681720 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-logs\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681745 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-config-data\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681772 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dstf7\" (UniqueName: \"kubernetes.io/projected/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-kube-api-access-dstf7\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc 
kubenswrapper[4728]: I0125 05:56:00.681804 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681838 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb2tg\" (UniqueName: \"kubernetes.io/projected/f8008816-7c7f-48f2-9802-93e91e3faefc-kube-api-access-gb2tg\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681872 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-config-data\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.681891 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.759042 4728 scope.go:117] "RemoveContainer" containerID="bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.759612 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d\": container with ID starting with 
bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d not found: ID does not exist" containerID="bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.759659 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d"} err="failed to get container status \"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d\": rpc error: code = NotFound desc = could not find container \"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d\": container with ID starting with bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.759695 4728 scope.go:117] "RemoveContainer" containerID="244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a" Jan 25 05:56:00 crc kubenswrapper[4728]: E0125 05:56:00.759973 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a\": container with ID starting with 244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a not found: ID does not exist" containerID="244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.759993 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a"} err="failed to get container status \"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a\": rpc error: code = NotFound desc = could not find container \"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a\": container with ID starting with 244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a not found: ID does not 
exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.760006 4728 scope.go:117] "RemoveContainer" containerID="bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.760270 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d"} err="failed to get container status \"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d\": rpc error: code = NotFound desc = could not find container \"bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d\": container with ID starting with bf35e8a69e41394f18a1024ce46e069a46554c91a43a80ec1e2d9b09b65c501d not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.760293 4728 scope.go:117] "RemoveContainer" containerID="244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.760564 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a"} err="failed to get container status \"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a\": rpc error: code = NotFound desc = could not find container \"244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a\": container with ID starting with 244ac84405bdf8c53ea92b46d17a0a002e1d707e08803d145c1451728ab5c46a not found: ID does not exist" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.761919 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb49f8487-rfk6x"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.767462 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb49f8487-rfk6x"] Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.787615 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.787742 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb2tg\" (UniqueName: \"kubernetes.io/projected/f8008816-7c7f-48f2-9802-93e91e3faefc-kube-api-access-gb2tg\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.787871 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-config-data\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.787895 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788178 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8008816-7c7f-48f2-9802-93e91e3faefc-logs\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788242 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-logs\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788348 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-config-data\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788397 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dstf7\" (UniqueName: \"kubernetes.io/projected/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-kube-api-access-dstf7\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8008816-7c7f-48f2-9802-93e91e3faefc-logs\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.788804 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-logs\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.791395 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.792756 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.793009 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.802589 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb2tg\" (UniqueName: \"kubernetes.io/projected/f8008816-7c7f-48f2-9802-93e91e3faefc-kube-api-access-gb2tg\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.802776 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-config-data\") pod \"nova-api-0\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.804003 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-config-data\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 
05:56:00.805360 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dstf7\" (UniqueName: \"kubernetes.io/projected/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-kube-api-access-dstf7\") pod \"nova-metadata-0\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") " pod="openstack/nova-metadata-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.836033 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:00 crc kubenswrapper[4728]: I0125 05:56:00.853265 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.013183 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.095671 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-config-data\") pod \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.095823 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpf79\" (UniqueName: \"kubernetes.io/projected/6b9f35ef-850d-4957-aa3e-b3be97fc945f-kube-api-access-zpf79\") pod \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.095932 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-combined-ca-bundle\") pod \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\" (UID: \"6b9f35ef-850d-4957-aa3e-b3be97fc945f\") " Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.105225 4728 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9f35ef-850d-4957-aa3e-b3be97fc945f-kube-api-access-zpf79" (OuterVolumeSpecName: "kube-api-access-zpf79") pod "6b9f35ef-850d-4957-aa3e-b3be97fc945f" (UID: "6b9f35ef-850d-4957-aa3e-b3be97fc945f"). InnerVolumeSpecName "kube-api-access-zpf79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.127266 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-config-data" (OuterVolumeSpecName: "config-data") pod "6b9f35ef-850d-4957-aa3e-b3be97fc945f" (UID: "6b9f35ef-850d-4957-aa3e-b3be97fc945f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.137330 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b9f35ef-850d-4957-aa3e-b3be97fc945f" (UID: "6b9f35ef-850d-4957-aa3e-b3be97fc945f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.198204 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.198436 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpf79\" (UniqueName: \"kubernetes.io/projected/6b9f35ef-850d-4957-aa3e-b3be97fc945f-kube-api-access-zpf79\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.198450 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f35ef-850d-4957-aa3e-b3be97fc945f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.316268 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:01 crc kubenswrapper[4728]: W0125 05:56:01.319940 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8008816_7c7f_48f2_9802_93e91e3faefc.slice/crio-6c1ed7ac2539d5b9f1b137e591aca6faebf005f75c09db38d0b6642de1ad2827 WatchSource:0}: Error finding container 6c1ed7ac2539d5b9f1b137e591aca6faebf005f75c09db38d0b6642de1ad2827: Status 404 returned error can't find the container with id 6c1ed7ac2539d5b9f1b137e591aca6faebf005f75c09db38d0b6642de1ad2827 Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.337418 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40634a2e-4d94-4f12-a5ab-c254f803bf16" path="/var/lib/kubelet/pods/40634a2e-4d94-4f12-a5ab-c254f803bf16/volumes" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.338202 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463497c7-9458-496e-ab52-19afdd182da4" 
path="/var/lib/kubelet/pods/463497c7-9458-496e-ab52-19afdd182da4/volumes" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.338792 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05bf129-98d0-4259-b8f8-f86b1e68b084" path="/var/lib/kubelet/pods/c05bf129-98d0-4259-b8f8-f86b1e68b084/volumes" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.406503 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.432968 4728 generic.go:334] "Generic (PLEG): container finished" podID="6b9f35ef-850d-4957-aa3e-b3be97fc945f" containerID="150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae" exitCode=0 Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.433061 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b9f35ef-850d-4957-aa3e-b3be97fc945f","Type":"ContainerDied","Data":"150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae"} Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.433102 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6b9f35ef-850d-4957-aa3e-b3be97fc945f","Type":"ContainerDied","Data":"190bae129c3f25f579688dd82e1ed273de90adcfce2a1521638080ac304be37f"} Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.433124 4728 scope.go:117] "RemoveContainer" containerID="150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.433247 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.459357 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8008816-7c7f-48f2-9802-93e91e3faefc","Type":"ContainerStarted","Data":"6c1ed7ac2539d5b9f1b137e591aca6faebf005f75c09db38d0b6642de1ad2827"} Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.462368 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.470826 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.493353 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:56:01 crc kubenswrapper[4728]: E0125 05:56:01.493874 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9f35ef-850d-4957-aa3e-b3be97fc945f" containerName="nova-scheduler-scheduler" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.493957 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9f35ef-850d-4957-aa3e-b3be97fc945f" containerName="nova-scheduler-scheduler" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.494186 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9f35ef-850d-4957-aa3e-b3be97fc945f" containerName="nova-scheduler-scheduler" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.496024 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.500201 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.532550 4728 scope.go:117] "RemoveContainer" containerID="150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.532651 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:56:01 crc kubenswrapper[4728]: E0125 05:56:01.536432 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae\": container with ID starting with 150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae not found: ID does not exist" containerID="150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.536533 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae"} err="failed to get container status \"150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae\": rpc error: code = NotFound desc = could not find container \"150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae\": container with ID starting with 150cf72c63da8b1c401bb4dda88d3498a82b84297d4450bfdb0c9c126a7d46ae not found: ID does not exist" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.607065 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " 
pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.607170 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltxr\" (UniqueName: \"kubernetes.io/projected/8fed3aed-aad1-4995-95b5-fe247410707e-kube-api-access-8ltxr\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.607214 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-config-data\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.709193 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltxr\" (UniqueName: \"kubernetes.io/projected/8fed3aed-aad1-4995-95b5-fe247410707e-kube-api-access-8ltxr\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.709548 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-config-data\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.709623 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.713788 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-config-data\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.714924 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.728498 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltxr\" (UniqueName: \"kubernetes.io/projected/8fed3aed-aad1-4995-95b5-fe247410707e-kube-api-access-8ltxr\") pod \"nova-scheduler-0\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") " pod="openstack/nova-scheduler-0" Jan 25 05:56:01 crc kubenswrapper[4728]: I0125 05:56:01.823833 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.237242 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 25 05:56:02 crc kubenswrapper[4728]: W0125 05:56:02.243569 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fed3aed_aad1_4995_95b5_fe247410707e.slice/crio-9a3a652300ff57c3c65252f738657da4aebd48a46562b4eeaed95e1b370eddd7 WatchSource:0}: Error finding container 9a3a652300ff57c3c65252f738657da4aebd48a46562b4eeaed95e1b370eddd7: Status 404 returned error can't find the container with id 9a3a652300ff57c3c65252f738657da4aebd48a46562b4eeaed95e1b370eddd7 Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.476744 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8008816-7c7f-48f2-9802-93e91e3faefc","Type":"ContainerStarted","Data":"1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.476814 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8008816-7c7f-48f2-9802-93e91e3faefc","Type":"ContainerStarted","Data":"eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.479923 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254","Type":"ContainerStarted","Data":"3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.480035 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254","Type":"ContainerStarted","Data":"2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 
05:56:02.480113 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254","Type":"ContainerStarted","Data":"2d35a9ff10e4750efebcb5b1a01c2b067eb9a12193c7a9070fc3893d443ae8ee"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.481596 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8fed3aed-aad1-4995-95b5-fe247410707e","Type":"ContainerStarted","Data":"fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.481638 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8fed3aed-aad1-4995-95b5-fe247410707e","Type":"ContainerStarted","Data":"9a3a652300ff57c3c65252f738657da4aebd48a46562b4eeaed95e1b370eddd7"} Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.506523 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5065028099999997 podStartE2EDuration="2.50650281s" podCreationTimestamp="2026-01-25 05:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:02.489600335 +0000 UTC m=+1053.525478316" watchObservedRunningTime="2026-01-25 05:56:02.50650281 +0000 UTC m=+1053.542380790" Jan 25 05:56:02 crc kubenswrapper[4728]: I0125 05:56:02.522280 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.522266417 podStartE2EDuration="2.522266417s" podCreationTimestamp="2026-01-25 05:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:02.51060166 +0000 UTC m=+1053.546479640" watchObservedRunningTime="2026-01-25 05:56:02.522266417 +0000 UTC m=+1053.558144397" Jan 25 05:56:02 crc 
kubenswrapper[4728]: I0125 05:56:02.531433 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.5313991599999999 podStartE2EDuration="1.53139916s" podCreationTimestamp="2026-01-25 05:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:02.522069476 +0000 UTC m=+1053.557947456" watchObservedRunningTime="2026-01-25 05:56:02.53139916 +0000 UTC m=+1053.567277140" Jan 25 05:56:03 crc kubenswrapper[4728]: I0125 05:56:03.340114 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9f35ef-850d-4957-aa3e-b3be97fc945f" path="/var/lib/kubelet/pods/6b9f35ef-850d-4957-aa3e-b3be97fc945f/volumes" Jan 25 05:56:05 crc kubenswrapper[4728]: I0125 05:56:05.467291 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 25 05:56:05 crc kubenswrapper[4728]: I0125 05:56:05.854075 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 25 05:56:05 crc kubenswrapper[4728]: I0125 05:56:05.854136 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 25 05:56:06 crc kubenswrapper[4728]: I0125 05:56:06.824362 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 25 05:56:08 crc kubenswrapper[4728]: I0125 05:56:08.745898 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.205091 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.205679 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" 
podUID="2a691f37-33ba-4d5b-988a-f8417e8e630b" containerName="kube-state-metrics" containerID="cri-o://a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc" gracePeriod=30 Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.555839 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.557856 4728 generic.go:334] "Generic (PLEG): container finished" podID="2a691f37-33ba-4d5b-988a-f8417e8e630b" containerID="a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc" exitCode=2 Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.557892 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2a691f37-33ba-4d5b-988a-f8417e8e630b","Type":"ContainerDied","Data":"a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc"} Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.557938 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2a691f37-33ba-4d5b-988a-f8417e8e630b","Type":"ContainerDied","Data":"e135b18907eb20ada58cea6f074df86d4b19c797c99361bdfffde6ebb2e5693f"} Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.557954 4728 scope.go:117] "RemoveContainer" containerID="a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc" Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.580496 4728 scope.go:117] "RemoveContainer" containerID="a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc" Jan 25 05:56:09 crc kubenswrapper[4728]: E0125 05:56:09.581896 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc\": container with ID starting with a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc not found: ID does not exist" 
containerID="a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc" Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.581940 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc"} err="failed to get container status \"a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc\": rpc error: code = NotFound desc = could not find container \"a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc\": container with ID starting with a885c594561867ccc643f2aed7801aa0410d2bdc20afe87cc7b32edf9ba43ccc not found: ID does not exist" Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.663630 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf5kb\" (UniqueName: \"kubernetes.io/projected/2a691f37-33ba-4d5b-988a-f8417e8e630b-kube-api-access-wf5kb\") pod \"2a691f37-33ba-4d5b-988a-f8417e8e630b\" (UID: \"2a691f37-33ba-4d5b-988a-f8417e8e630b\") " Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.669519 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a691f37-33ba-4d5b-988a-f8417e8e630b-kube-api-access-wf5kb" (OuterVolumeSpecName: "kube-api-access-wf5kb") pod "2a691f37-33ba-4d5b-988a-f8417e8e630b" (UID: "2a691f37-33ba-4d5b-988a-f8417e8e630b"). InnerVolumeSpecName "kube-api-access-wf5kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:09 crc kubenswrapper[4728]: I0125 05:56:09.765594 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf5kb\" (UniqueName: \"kubernetes.io/projected/2a691f37-33ba-4d5b-988a-f8417e8e630b-kube-api-access-wf5kb\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.567540 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.602306 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.609047 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.619846 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:56:10 crc kubenswrapper[4728]: E0125 05:56:10.622133 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a691f37-33ba-4d5b-988a-f8417e8e630b" containerName="kube-state-metrics" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.622599 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a691f37-33ba-4d5b-988a-f8417e8e630b" containerName="kube-state-metrics" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.622904 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a691f37-33ba-4d5b-988a-f8417e8e630b" containerName="kube-state-metrics" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.623780 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.626944 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.626994 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.630340 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.785472 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.785801 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xht92\" (UniqueName: \"kubernetes.io/projected/c45b9d32-afe0-490e-876d-64a9359773ff-kube-api-access-xht92\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.785860 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.785898 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.837369 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.837417 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.854509 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.854697 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.886753 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.886806 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.886900 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.886940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xht92\" (UniqueName: \"kubernetes.io/projected/c45b9d32-afe0-490e-876d-64a9359773ff-kube-api-access-xht92\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.891844 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.894925 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.901176 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45b9d32-afe0-490e-876d-64a9359773ff-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.903641 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xht92\" (UniqueName: \"kubernetes.io/projected/c45b9d32-afe0-490e-876d-64a9359773ff-kube-api-access-xht92\") pod \"kube-state-metrics-0\" (UID: \"c45b9d32-afe0-490e-876d-64a9359773ff\") " pod="openstack/kube-state-metrics-0" 
Jan 25 05:56:10 crc kubenswrapper[4728]: I0125 05:56:10.936443 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.281193 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.281810 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-central-agent" containerID="cri-o://cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3" gracePeriod=30 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.282240 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-notification-agent" containerID="cri-o://8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4" gracePeriod=30 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.282407 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="sg-core" containerID="cri-o://f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3" gracePeriod=30 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.286473 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="proxy-httpd" containerID="cri-o://87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4" gracePeriod=30 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.337551 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a691f37-33ba-4d5b-988a-f8417e8e630b" path="/var/lib/kubelet/pods/2a691f37-33ba-4d5b-988a-f8417e8e630b/volumes" Jan 25 05:56:11 crc 
kubenswrapper[4728]: I0125 05:56:11.345165 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 25 05:56:11 crc kubenswrapper[4728]: W0125 05:56:11.345758 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc45b9d32_afe0_490e_876d_64a9359773ff.slice/crio-e6290129bdba8162ca4d3d394bfd9edc1c85062ac90b8f6404e0d3803e064c45 WatchSource:0}: Error finding container e6290129bdba8162ca4d3d394bfd9edc1c85062ac90b8f6404e0d3803e064c45: Status 404 returned error can't find the container with id e6290129bdba8162ca4d3d394bfd9edc1c85062ac90b8f6404e0d3803e064c45 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.574103 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c45b9d32-afe0-490e-876d-64a9359773ff","Type":"ContainerStarted","Data":"e6290129bdba8162ca4d3d394bfd9edc1c85062ac90b8f6404e0d3803e064c45"} Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.576274 4728 generic.go:334] "Generic (PLEG): container finished" podID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerID="87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4" exitCode=0 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.576298 4728 generic.go:334] "Generic (PLEG): container finished" podID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerID="f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3" exitCode=2 Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.576886 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerDied","Data":"87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4"} Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.576914 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerDied","Data":"f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3"} Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.824444 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.854043 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.932427 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.932525 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.932585 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:11 crc kubenswrapper[4728]: I0125 05:56:11.932623 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:12 crc 
kubenswrapper[4728]: I0125 05:56:12.585471 4728 generic.go:334] "Generic (PLEG): container finished" podID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerID="cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3" exitCode=0 Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.585520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerDied","Data":"cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3"} Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.587205 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c45b9d32-afe0-490e-876d-64a9359773ff","Type":"ContainerStarted","Data":"3ed7336416992c5313ad7239bfe8e1c988fc13869a844857306021e193742e67"} Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.606937 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.337536803 podStartE2EDuration="2.606921489s" podCreationTimestamp="2026-01-25 05:56:10 +0000 UTC" firstStartedPulling="2026-01-25 05:56:11.347450792 +0000 UTC m=+1062.383328771" lastFinishedPulling="2026-01-25 05:56:11.616835477 +0000 UTC m=+1062.652713457" observedRunningTime="2026-01-25 05:56:12.600377087 +0000 UTC m=+1063.636255068" watchObservedRunningTime="2026-01-25 05:56:12.606921489 +0000 UTC m=+1063.642799469" Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.612536 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.899020 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:56:12 
crc kubenswrapper[4728]: I0125 05:56:12.899346 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.899397 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.900666 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cfcdf54d823ad6beb0133f29e917610e444d0fa6cfe06f430b6751fe7dbea675"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 05:56:12 crc kubenswrapper[4728]: I0125 05:56:12.900735 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://cfcdf54d823ad6beb0133f29e917610e444d0fa6cfe06f430b6751fe7dbea675" gracePeriod=600 Jan 25 05:56:13 crc kubenswrapper[4728]: I0125 05:56:13.607197 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="cfcdf54d823ad6beb0133f29e917610e444d0fa6cfe06f430b6751fe7dbea675" exitCode=0 Jan 25 05:56:13 crc kubenswrapper[4728]: I0125 05:56:13.607337 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" 
event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"cfcdf54d823ad6beb0133f29e917610e444d0fa6cfe06f430b6751fe7dbea675"} Jan 25 05:56:13 crc kubenswrapper[4728]: I0125 05:56:13.607849 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"fd316250bf57712586994889b62bcccbaedbf4eba29b23e84c2d634ac0c7e82a"} Jan 25 05:56:13 crc kubenswrapper[4728]: I0125 05:56:13.607868 4728 scope.go:117] "RemoveContainer" containerID="5b2523b29483494490949dbc53c4bdb9d3c9b4b7a93fe4055f11cc91a7d873b4" Jan 25 05:56:13 crc kubenswrapper[4728]: I0125 05:56:13.608576 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.592291 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.653825 4728 generic.go:334] "Generic (PLEG): container finished" podID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerID="8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4" exitCode=0 Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.653875 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.653881 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerDied","Data":"8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4"} Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.653921 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7990d8bf-fbf2-479c-9bfb-690fa3141dad","Type":"ContainerDied","Data":"d2bf1a971dda651d95fd20610bb321b8aa8424a8a7a8ee8962a646661e40d202"} Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.653946 4728 scope.go:117] "RemoveContainer" containerID="87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.671109 4728 scope.go:117] "RemoveContainer" containerID="f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.685437 4728 scope.go:117] "RemoveContainer" containerID="8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.703072 4728 scope.go:117] "RemoveContainer" containerID="cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.719022 4728 scope.go:117] "RemoveContainer" containerID="87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4" Jan 25 05:56:18 crc kubenswrapper[4728]: E0125 05:56:18.719450 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4\": container with ID starting with 87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4 not found: ID does not exist" 
containerID="87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.719498 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4"} err="failed to get container status \"87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4\": rpc error: code = NotFound desc = could not find container \"87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4\": container with ID starting with 87ae94fd4156ce0283739acb544d7178ac9311217740b418f0a462c9c5b5c2b4 not found: ID does not exist" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.719533 4728 scope.go:117] "RemoveContainer" containerID="f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3" Jan 25 05:56:18 crc kubenswrapper[4728]: E0125 05:56:18.720121 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3\": container with ID starting with f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3 not found: ID does not exist" containerID="f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.720189 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3"} err="failed to get container status \"f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3\": rpc error: code = NotFound desc = could not find container \"f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3\": container with ID starting with f7721e6cda990fe67278ccffa2fe8ce54d5f73cdca7c56f7bb28cd52d723f3e3 not found: ID does not exist" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.720208 4728 scope.go:117] 
"RemoveContainer" containerID="8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4" Jan 25 05:56:18 crc kubenswrapper[4728]: E0125 05:56:18.720618 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4\": container with ID starting with 8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4 not found: ID does not exist" containerID="8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.720656 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4"} err="failed to get container status \"8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4\": rpc error: code = NotFound desc = could not find container \"8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4\": container with ID starting with 8bc5a71cc3a55b431a7ca7a77321a37334e748046179898f7e03112a4469a5d4 not found: ID does not exist" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.720686 4728 scope.go:117] "RemoveContainer" containerID="cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3" Jan 25 05:56:18 crc kubenswrapper[4728]: E0125 05:56:18.721083 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3\": container with ID starting with cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3 not found: ID does not exist" containerID="cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.721106 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3"} err="failed to get container status \"cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3\": rpc error: code = NotFound desc = could not find container \"cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3\": container with ID starting with cc33a7689ad91f210ae700179c21222cfd6d398f8df85be73a483025dfa314b3 not found: ID does not exist" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.743017 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-combined-ca-bundle\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.743790 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-config-data\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.743840 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-log-httpd\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.743901 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-sg-core-conf-yaml\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.744056 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-scripts\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.744184 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-run-httpd\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.744228 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9k7t\" (UniqueName: \"kubernetes.io/projected/7990d8bf-fbf2-479c-9bfb-690fa3141dad-kube-api-access-z9k7t\") pod \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\" (UID: \"7990d8bf-fbf2-479c-9bfb-690fa3141dad\") " Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.744684 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.744791 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.745812 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.745845 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7990d8bf-fbf2-479c-9bfb-690fa3141dad-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.750874 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7990d8bf-fbf2-479c-9bfb-690fa3141dad-kube-api-access-z9k7t" (OuterVolumeSpecName: "kube-api-access-z9k7t") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "kube-api-access-z9k7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.751052 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-scripts" (OuterVolumeSpecName: "scripts") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.772101 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.821105 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.821574 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-config-data" (OuterVolumeSpecName: "config-data") pod "7990d8bf-fbf2-479c-9bfb-690fa3141dad" (UID: "7990d8bf-fbf2-479c-9bfb-690fa3141dad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.849370 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9k7t\" (UniqueName: \"kubernetes.io/projected/7990d8bf-fbf2-479c-9bfb-690fa3141dad-kube-api-access-z9k7t\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.849412 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.849425 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.849440 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.849451 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7990d8bf-fbf2-479c-9bfb-690fa3141dad-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:18 crc kubenswrapper[4728]: I0125 05:56:18.991354 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.001244 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016083 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:19 crc kubenswrapper[4728]: E0125 05:56:19.016556 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-notification-agent" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016579 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-notification-agent" Jan 25 05:56:19 crc kubenswrapper[4728]: E0125 05:56:19.016595 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="proxy-httpd" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016602 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="proxy-httpd" Jan 25 05:56:19 crc kubenswrapper[4728]: E0125 05:56:19.016613 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="sg-core" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016619 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="sg-core" Jan 25 05:56:19 crc kubenswrapper[4728]: E0125 05:56:19.016630 4728 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-central-agent" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016638 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-central-agent" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016873 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="sg-core" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016898 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-notification-agent" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016917 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="proxy-httpd" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.016927 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" containerName="ceilometer-central-agent" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.019165 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.021116 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.021296 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.021637 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.033403 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052184 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-scripts\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052251 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-config-data\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052365 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-log-httpd\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052386 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbvk\" (UniqueName: 
\"kubernetes.io/projected/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-kube-api-access-pfbvk\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052413 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052458 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052480 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-run-httpd\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.052505 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154367 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-config-data\") pod \"ceilometer-0\" (UID: 
\"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154506 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-log-httpd\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154531 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbvk\" (UniqueName: \"kubernetes.io/projected/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-kube-api-access-pfbvk\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154565 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154635 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-run-httpd\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154690 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.154801 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-scripts\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.155356 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-run-httpd\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.155395 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-log-httpd\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.159932 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.160061 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-config-data\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 
05:56:19.161332 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.163698 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-scripts\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.163736 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.170086 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbvk\" (UniqueName: \"kubernetes.io/projected/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-kube-api-access-pfbvk\") pod \"ceilometer-0\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.340004 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7990d8bf-fbf2-479c-9bfb-690fa3141dad" path="/var/lib/kubelet/pods/7990d8bf-fbf2-479c-9bfb-690fa3141dad/volumes" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.341265 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:19 crc kubenswrapper[4728]: I0125 05:56:19.737603 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.678652 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerStarted","Data":"246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba"} Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.679153 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerStarted","Data":"2035b3aa49189f6eafd20d2f0c4635892909f128643019f1fbd894f184a6c294"} Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.842942 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.843335 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.844208 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.846644 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.865114 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.865688 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 25 05:56:20 crc kubenswrapper[4728]: I0125 05:56:20.877449 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 25 05:56:20 crc kubenswrapper[4728]: 
I0125 05:56:20.946530 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.687871 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerStarted","Data":"23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2"} Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.688172 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.692217 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.696036 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.863282 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd57d5db9-z9qlh"] Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.864915 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:21 crc kubenswrapper[4728]: I0125 05:56:21.893785 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd57d5db9-z9qlh"] Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.023083 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-svc\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.023368 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-swift-storage-0\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.023443 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-config\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.023468 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-sb\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.023498 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-nb\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.023566 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx6w6\" (UniqueName: \"kubernetes.io/projected/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-kube-api-access-sx6w6\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.126271 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-config\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.126357 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-sb\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.126412 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-nb\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.126634 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx6w6\" (UniqueName: 
\"kubernetes.io/projected/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-kube-api-access-sx6w6\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.126686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-svc\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.126781 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-swift-storage-0\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.127763 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-config\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.127841 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-swift-storage-0\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.127891 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-sb\") pod 
\"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.128063 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-svc\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.128504 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-nb\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.147114 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx6w6\" (UniqueName: \"kubernetes.io/projected/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-kube-api-access-sx6w6\") pod \"dnsmasq-dns-fd57d5db9-z9qlh\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.254884 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.695759 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerStarted","Data":"f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c"} Jan 25 05:56:22 crc kubenswrapper[4728]: I0125 05:56:22.719290 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd57d5db9-z9qlh"] Jan 25 05:56:23 crc kubenswrapper[4728]: E0125 05:56:23.123885 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc39392_7c35_4e6d_b06d_d0e6679bcd87.slice/crio-conmon-6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef.scope\": RecentStats: unable to find data in memory cache]" Jan 25 05:56:23 crc kubenswrapper[4728]: I0125 05:56:23.712667 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerStarted","Data":"2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9"} Jan 25 05:56:23 crc kubenswrapper[4728]: I0125 05:56:23.713121 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 25 05:56:23 crc kubenswrapper[4728]: I0125 05:56:23.715290 4728 generic.go:334] "Generic (PLEG): container finished" podID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerID="6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef" exitCode=0 Jan 25 05:56:23 crc kubenswrapper[4728]: I0125 05:56:23.715428 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" event={"ID":"4bc39392-7c35-4e6d-b06d-d0e6679bcd87","Type":"ContainerDied","Data":"6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef"} Jan 25 05:56:23 crc 
kubenswrapper[4728]: I0125 05:56:23.715527 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" event={"ID":"4bc39392-7c35-4e6d-b06d-d0e6679bcd87","Type":"ContainerStarted","Data":"d5899acfee929c19a305cdf7df82a35766dcfa8740ddac3e91f7491b6350c006"} Jan 25 05:56:23 crc kubenswrapper[4728]: I0125 05:56:23.734832 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.265508336 podStartE2EDuration="5.734812734s" podCreationTimestamp="2026-01-25 05:56:18 +0000 UTC" firstStartedPulling="2026-01-25 05:56:19.745575789 +0000 UTC m=+1070.781453769" lastFinishedPulling="2026-01-25 05:56:23.214880186 +0000 UTC m=+1074.250758167" observedRunningTime="2026-01-25 05:56:23.728663708 +0000 UTC m=+1074.764541688" watchObservedRunningTime="2026-01-25 05:56:23.734812734 +0000 UTC m=+1074.770690714" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.101515 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.677726 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.725050 4728 generic.go:334] "Generic (PLEG): container finished" podID="018525d6-89b2-4f6f-8833-c60a2e82aa86" containerID="10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a" exitCode=137 Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.725152 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"018525d6-89b2-4f6f-8833-c60a2e82aa86","Type":"ContainerDied","Data":"10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a"} Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.725186 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"018525d6-89b2-4f6f-8833-c60a2e82aa86","Type":"ContainerDied","Data":"4afbd8aa8bc85c7476a5dccae4f44f56b9f397be1e434eff59225db6f4f25f10"} Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.725202 4728 scope.go:117] "RemoveContainer" containerID="10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.725356 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.732646 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" event={"ID":"4bc39392-7c35-4e6d-b06d-d0e6679bcd87","Type":"ContainerStarted","Data":"e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c"} Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.733411 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-log" containerID="cri-o://eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d" gracePeriod=30 Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.733557 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-api" containerID="cri-o://1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8" gracePeriod=30 Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.733795 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.758637 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" podStartSLOduration=3.75861609 podStartE2EDuration="3.75861609s" podCreationTimestamp="2026-01-25 05:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:24.755198916 +0000 UTC m=+1075.791076896" watchObservedRunningTime="2026-01-25 05:56:24.75861609 +0000 UTC m=+1075.794494070" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.765762 4728 scope.go:117] "RemoveContainer" containerID="10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a" Jan 25 05:56:24 
crc kubenswrapper[4728]: E0125 05:56:24.768728 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a\": container with ID starting with 10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a not found: ID does not exist" containerID="10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.768798 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a"} err="failed to get container status \"10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a\": rpc error: code = NotFound desc = could not find container \"10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a\": container with ID starting with 10c967a45755382466fe3c68e7d993755dfa7584d67ffb317711ec62b362f93a not found: ID does not exist" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.812714 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-combined-ca-bundle\") pod \"018525d6-89b2-4f6f-8833-c60a2e82aa86\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.812785 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tb2k\" (UniqueName: \"kubernetes.io/projected/018525d6-89b2-4f6f-8833-c60a2e82aa86-kube-api-access-4tb2k\") pod \"018525d6-89b2-4f6f-8833-c60a2e82aa86\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.812859 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-config-data\") pod \"018525d6-89b2-4f6f-8833-c60a2e82aa86\" (UID: \"018525d6-89b2-4f6f-8833-c60a2e82aa86\") " Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.828749 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018525d6-89b2-4f6f-8833-c60a2e82aa86-kube-api-access-4tb2k" (OuterVolumeSpecName: "kube-api-access-4tb2k") pod "018525d6-89b2-4f6f-8833-c60a2e82aa86" (UID: "018525d6-89b2-4f6f-8833-c60a2e82aa86"). InnerVolumeSpecName "kube-api-access-4tb2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.838535 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-config-data" (OuterVolumeSpecName: "config-data") pod "018525d6-89b2-4f6f-8833-c60a2e82aa86" (UID: "018525d6-89b2-4f6f-8833-c60a2e82aa86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.842965 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "018525d6-89b2-4f6f-8833-c60a2e82aa86" (UID: "018525d6-89b2-4f6f-8833-c60a2e82aa86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.916763 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.917691 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tb2k\" (UniqueName: \"kubernetes.io/projected/018525d6-89b2-4f6f-8833-c60a2e82aa86-kube-api-access-4tb2k\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:24 crc kubenswrapper[4728]: I0125 05:56:24.917753 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018525d6-89b2-4f6f-8833-c60a2e82aa86-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.056656 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.066143 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.078792 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:56:25 crc kubenswrapper[4728]: E0125 05:56:25.079491 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018525d6-89b2-4f6f-8833-c60a2e82aa86" containerName="nova-cell1-novncproxy-novncproxy" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.079520 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="018525d6-89b2-4f6f-8833-c60a2e82aa86" containerName="nova-cell1-novncproxy-novncproxy" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.079773 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="018525d6-89b2-4f6f-8833-c60a2e82aa86" containerName="nova-cell1-novncproxy-novncproxy" Jan 25 
05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.080745 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.085621 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.085754 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.085985 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.096174 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.120421 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.120473 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wm9w\" (UniqueName: \"kubernetes.io/projected/99b7a342-4f5c-4977-b189-b0e4cf975704-kube-api-access-8wm9w\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.120509 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.120548 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.120599 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.222663 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.222773 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wm9w\" (UniqueName: \"kubernetes.io/projected/99b7a342-4f5c-4977-b189-b0e4cf975704-kube-api-access-8wm9w\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.222843 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.222916 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.223037 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.227589 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.228054 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.229114 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc 
kubenswrapper[4728]: I0125 05:56:25.239865 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b7a342-4f5c-4977-b189-b0e4cf975704-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.241142 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wm9w\" (UniqueName: \"kubernetes.io/projected/99b7a342-4f5c-4977-b189-b0e4cf975704-kube-api-access-8wm9w\") pod \"nova-cell1-novncproxy-0\" (UID: \"99b7a342-4f5c-4977-b189-b0e4cf975704\") " pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.340491 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018525d6-89b2-4f6f-8833-c60a2e82aa86" path="/var/lib/kubelet/pods/018525d6-89b2-4f6f-8833-c60a2e82aa86/volumes" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.407208 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.746283 4728 generic.go:334] "Generic (PLEG): container finished" podID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerID="eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d" exitCode=143 Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.746421 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8008816-7c7f-48f2-9802-93e91e3faefc","Type":"ContainerDied","Data":"eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d"} Jan 25 05:56:25 crc kubenswrapper[4728]: I0125 05:56:25.829036 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.000031 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.000255 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-central-agent" containerID="cri-o://246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba" gracePeriod=30 Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.000288 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="proxy-httpd" containerID="cri-o://2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9" gracePeriod=30 Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.000366 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="sg-core" containerID="cri-o://f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c" gracePeriod=30 Jan 25 05:56:26 crc 
kubenswrapper[4728]: I0125 05:56:26.000398 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-notification-agent" containerID="cri-o://23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2" gracePeriod=30 Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.775691 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"99b7a342-4f5c-4977-b189-b0e4cf975704","Type":"ContainerStarted","Data":"ef50c6dcb717fee0cb1a01a77b457fd75dc5f12a530258d6389dcaa8fe22dc7c"} Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.776071 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"99b7a342-4f5c-4977-b189-b0e4cf975704","Type":"ContainerStarted","Data":"7d3ef1306bf88d402acdb2c1a06380ff3adcc8f827d6e79eb0aca384c12bfeb4"} Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.779148 4728 generic.go:334] "Generic (PLEG): container finished" podID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerID="2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9" exitCode=0 Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.779185 4728 generic.go:334] "Generic (PLEG): container finished" podID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerID="f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c" exitCode=2 Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.779196 4728 generic.go:334] "Generic (PLEG): container finished" podID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerID="23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2" exitCode=0 Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.779223 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerDied","Data":"2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9"} Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.779260 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerDied","Data":"f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c"} Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.779272 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerDied","Data":"23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2"} Jan 25 05:56:26 crc kubenswrapper[4728]: I0125 05:56:26.798972 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.798952432 podStartE2EDuration="1.798952432s" podCreationTimestamp="2026-01-25 05:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:26.790411666 +0000 UTC m=+1077.826289646" watchObservedRunningTime="2026-01-25 05:56:26.798952432 +0000 UTC m=+1077.834830412" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.299121 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.397732 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-config-data\") pod \"f8008816-7c7f-48f2-9802-93e91e3faefc\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.397826 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8008816-7c7f-48f2-9802-93e91e3faefc-logs\") pod \"f8008816-7c7f-48f2-9802-93e91e3faefc\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.397894 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-combined-ca-bundle\") pod \"f8008816-7c7f-48f2-9802-93e91e3faefc\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.397962 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb2tg\" (UniqueName: \"kubernetes.io/projected/f8008816-7c7f-48f2-9802-93e91e3faefc-kube-api-access-gb2tg\") pod \"f8008816-7c7f-48f2-9802-93e91e3faefc\" (UID: \"f8008816-7c7f-48f2-9802-93e91e3faefc\") " Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.398951 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8008816-7c7f-48f2-9802-93e91e3faefc-logs" (OuterVolumeSpecName: "logs") pod "f8008816-7c7f-48f2-9802-93e91e3faefc" (UID: "f8008816-7c7f-48f2-9802-93e91e3faefc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.418262 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8008816-7c7f-48f2-9802-93e91e3faefc-kube-api-access-gb2tg" (OuterVolumeSpecName: "kube-api-access-gb2tg") pod "f8008816-7c7f-48f2-9802-93e91e3faefc" (UID: "f8008816-7c7f-48f2-9802-93e91e3faefc"). InnerVolumeSpecName "kube-api-access-gb2tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.429561 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-config-data" (OuterVolumeSpecName: "config-data") pod "f8008816-7c7f-48f2-9802-93e91e3faefc" (UID: "f8008816-7c7f-48f2-9802-93e91e3faefc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.431410 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8008816-7c7f-48f2-9802-93e91e3faefc" (UID: "f8008816-7c7f-48f2-9802-93e91e3faefc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.500433 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.500461 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8008816-7c7f-48f2-9802-93e91e3faefc-logs\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.500471 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8008816-7c7f-48f2-9802-93e91e3faefc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.500481 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb2tg\" (UniqueName: \"kubernetes.io/projected/f8008816-7c7f-48f2-9802-93e91e3faefc-kube-api-access-gb2tg\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.801766 4728 generic.go:334] "Generic (PLEG): container finished" podID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerID="1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8" exitCode=0 Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.801817 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8008816-7c7f-48f2-9802-93e91e3faefc","Type":"ContainerDied","Data":"1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8"} Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.801957 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8008816-7c7f-48f2-9802-93e91e3faefc","Type":"ContainerDied","Data":"6c1ed7ac2539d5b9f1b137e591aca6faebf005f75c09db38d0b6642de1ad2827"} Jan 25 05:56:28 crc kubenswrapper[4728]: 
I0125 05:56:28.802003 4728 scope.go:117] "RemoveContainer" containerID="1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.802775 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.842281 4728 scope.go:117] "RemoveContainer" containerID="eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.848139 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.861140 4728 scope.go:117] "RemoveContainer" containerID="1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8" Jan 25 05:56:28 crc kubenswrapper[4728]: E0125 05:56:28.861663 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8\": container with ID starting with 1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8 not found: ID does not exist" containerID="1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.861773 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8"} err="failed to get container status \"1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8\": rpc error: code = NotFound desc = could not find container \"1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8\": container with ID starting with 1cd9a371a53025dc8e52894503399900f85d66a6f4dd5db29fba525cd829b4f8 not found: ID does not exist" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.861848 4728 scope.go:117] "RemoveContainer" 
containerID="eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d" Jan 25 05:56:28 crc kubenswrapper[4728]: E0125 05:56:28.862159 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d\": container with ID starting with eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d not found: ID does not exist" containerID="eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.862255 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d"} err="failed to get container status \"eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d\": rpc error: code = NotFound desc = could not find container \"eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d\": container with ID starting with eef802a2f41a9a06d0d1f20b215e37ef422d2f54e8b288ded326c29634e2056d not found: ID does not exist" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.862355 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.872050 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:28 crc kubenswrapper[4728]: E0125 05:56:28.872520 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-log" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.872542 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-log" Jan 25 05:56:28 crc kubenswrapper[4728]: E0125 05:56:28.872592 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" 
containerName="nova-api-api" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.872600 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-api" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.872825 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-log" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.872843 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" containerName="nova-api-api" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.877713 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.879674 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.880645 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.880948 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 25 05:56:28 crc kubenswrapper[4728]: I0125 05:56:28.887771 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.010883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-config-data\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.010954 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-public-tls-certs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.011184 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.011408 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.011480 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4lq2\" (UniqueName: \"kubernetes.io/projected/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-kube-api-access-p4lq2\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.011709 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-logs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.115050 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-config-data\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " 
pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.115132 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-public-tls-certs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.115201 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.115290 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.115391 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4lq2\" (UniqueName: \"kubernetes.io/projected/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-kube-api-access-p4lq2\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.115565 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-logs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.116148 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-logs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.120942 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-public-tls-certs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.121156 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.121462 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.122259 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-config-data\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.133612 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4lq2\" (UniqueName: \"kubernetes.io/projected/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-kube-api-access-p4lq2\") pod \"nova-api-0\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") " pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.194554 4728 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.350428 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8008816-7c7f-48f2-9802-93e91e3faefc" path="/var/lib/kubelet/pods/f8008816-7c7f-48f2-9802-93e91e3faefc/volumes" Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.629021 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.812343 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26","Type":"ContainerStarted","Data":"6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1"} Jan 25 05:56:29 crc kubenswrapper[4728]: I0125 05:56:29.812440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26","Type":"ContainerStarted","Data":"ec7cdf8bd3e48c75e0d01012424ae314167706d826bee1cf76b704e581069484"} Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.204236 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.240806 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-combined-ca-bundle\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.240862 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-config-data\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.240919 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-run-httpd\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241007 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfbvk\" (UniqueName: \"kubernetes.io/projected/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-kube-api-access-pfbvk\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241096 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-scripts\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241139 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-sg-core-conf-yaml\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241160 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-log-httpd\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241261 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241375 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-ceilometer-tls-certs\") pod \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\" (UID: \"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246\") " Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.241608 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.242181 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.242217 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.248987 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-kube-api-access-pfbvk" (OuterVolumeSpecName: "kube-api-access-pfbvk") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "kube-api-access-pfbvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.249091 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-scripts" (OuterVolumeSpecName: "scripts") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.264435 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.286971 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.313168 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-config-data" (OuterVolumeSpecName: "config-data") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.318031 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" (UID: "abdbbbb6-18f9-4dc6-bb9e-46e966c6b246"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.344080 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-scripts\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.344119 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.344133 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.344153 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.344167 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.344181 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfbvk\" (UniqueName: \"kubernetes.io/projected/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246-kube-api-access-pfbvk\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.407432 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.823194 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26","Type":"ContainerStarted","Data":"1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83"} Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.827266 4728 generic.go:334] "Generic (PLEG): container finished" podID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerID="246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba" exitCode=0 Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.827349 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerDied","Data":"246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba"} Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.827436 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"abdbbbb6-18f9-4dc6-bb9e-46e966c6b246","Type":"ContainerDied","Data":"2035b3aa49189f6eafd20d2f0c4635892909f128643019f1fbd894f184a6c294"} Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.827463 4728 scope.go:117] "RemoveContainer" containerID="2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.827375 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.847141 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.847122443 podStartE2EDuration="2.847122443s" podCreationTimestamp="2026-01-25 05:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:30.837611006 +0000 UTC m=+1081.873488986" watchObservedRunningTime="2026-01-25 05:56:30.847122443 +0000 UTC m=+1081.883000423" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.848652 4728 scope.go:117] "RemoveContainer" containerID="f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.866030 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.871264 4728 scope.go:117] "RemoveContainer" containerID="23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.877795 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.885765 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.886204 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-central-agent" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886225 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-central-agent" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.886257 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" 
containerName="sg-core" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886264 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="sg-core" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.886273 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="proxy-httpd" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886279 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="proxy-httpd" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.886287 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-notification-agent" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886293 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-notification-agent" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886525 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-notification-agent" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886544 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="proxy-httpd" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886562 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="sg-core" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.886580 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" containerName="ceilometer-central-agent" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.888406 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.890878 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.891172 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.891254 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.893453 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.896802 4728 scope.go:117] "RemoveContainer" containerID="246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.923131 4728 scope.go:117] "RemoveContainer" containerID="2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.923518 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9\": container with ID starting with 2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9 not found: ID does not exist" containerID="2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.923571 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9"} err="failed to get container status \"2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9\": rpc error: code = NotFound desc = could not find container \"2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9\": 
container with ID starting with 2f972704ccf853c15c8781896430b2e57f7de0dc7b41d3140ddaff102784bad9 not found: ID does not exist" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.923599 4728 scope.go:117] "RemoveContainer" containerID="f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.923925 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c\": container with ID starting with f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c not found: ID does not exist" containerID="f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.923946 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c"} err="failed to get container status \"f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c\": rpc error: code = NotFound desc = could not find container \"f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c\": container with ID starting with f6e2c315f45e680ccafc92ed9ee43e3e0bf401289492c2f36c8e4f8fbfbbf14c not found: ID does not exist" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.923983 4728 scope.go:117] "RemoveContainer" containerID="23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.924275 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2\": container with ID starting with 23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2 not found: ID does not exist" 
containerID="23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.924294 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2"} err="failed to get container status \"23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2\": rpc error: code = NotFound desc = could not find container \"23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2\": container with ID starting with 23005742071f1759ee4760f3e78c0d6ddcfd57dbab1ff7518474f3cb81f383a2 not found: ID does not exist" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.924308 4728 scope.go:117] "RemoveContainer" containerID="246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba" Jan 25 05:56:30 crc kubenswrapper[4728]: E0125 05:56:30.924896 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba\": container with ID starting with 246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba not found: ID does not exist" containerID="246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.924944 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba"} err="failed to get container status \"246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba\": rpc error: code = NotFound desc = could not find container \"246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba\": container with ID starting with 246e2fe9516df9a01c43dea5cfc31af2f157ac19e161f16c9a36a6f51563dbba not found: ID does not exist" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.961702 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-config-data\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.961859 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05cf94b4-4884-4e05-9036-3676fb8aedcb-run-httpd\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.962009 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-scripts\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.962047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.962133 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwn2\" (UniqueName: \"kubernetes.io/projected/05cf94b4-4884-4e05-9036-3676fb8aedcb-kube-api-access-knwn2\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.962182 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.962221 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:30 crc kubenswrapper[4728]: I0125 05:56:30.962263 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05cf94b4-4884-4e05-9036-3676fb8aedcb-log-httpd\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063128 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-config-data\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063199 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05cf94b4-4884-4e05-9036-3676fb8aedcb-run-httpd\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063244 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-scripts\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc 
kubenswrapper[4728]: I0125 05:56:31.063268 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063294 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knwn2\" (UniqueName: \"kubernetes.io/projected/05cf94b4-4884-4e05-9036-3676fb8aedcb-kube-api-access-knwn2\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063330 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063354 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063376 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05cf94b4-4884-4e05-9036-3676fb8aedcb-log-httpd\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.063820 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/05cf94b4-4884-4e05-9036-3676fb8aedcb-log-httpd\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.066883 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05cf94b4-4884-4e05-9036-3676fb8aedcb-run-httpd\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.070865 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.071519 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-scripts\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.071563 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-config-data\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.076986 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.079981 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cf94b4-4884-4e05-9036-3676fb8aedcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.082836 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwn2\" (UniqueName: \"kubernetes.io/projected/05cf94b4-4884-4e05-9036-3676fb8aedcb-kube-api-access-knwn2\") pod \"ceilometer-0\" (UID: \"05cf94b4-4884-4e05-9036-3676fb8aedcb\") " pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.203596 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.347106 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdbbbb6-18f9-4dc6-bb9e-46e966c6b246" path="/var/lib/kubelet/pods/abdbbbb6-18f9-4dc6-bb9e-46e966c6b246/volumes" Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.602778 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 25 05:56:31 crc kubenswrapper[4728]: W0125 05:56:31.610646 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05cf94b4_4884_4e05_9036_3676fb8aedcb.slice/crio-89c6386bf24faa629f03ef9782d3ca8f3faf42e74d696bc690cedd61ba19fb63 WatchSource:0}: Error finding container 89c6386bf24faa629f03ef9782d3ca8f3faf42e74d696bc690cedd61ba19fb63: Status 404 returned error can't find the container with id 89c6386bf24faa629f03ef9782d3ca8f3faf42e74d696bc690cedd61ba19fb63 Jan 25 05:56:31 crc kubenswrapper[4728]: I0125 05:56:31.840201 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"05cf94b4-4884-4e05-9036-3676fb8aedcb","Type":"ContainerStarted","Data":"89c6386bf24faa629f03ef9782d3ca8f3faf42e74d696bc690cedd61ba19fb63"} Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.256383 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.319223 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fdb9bc75-sh8qj"] Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.319453 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerName="dnsmasq-dns" containerID="cri-o://00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae" gracePeriod=10 Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.825813 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.849515 4728 generic.go:334] "Generic (PLEG): container finished" podID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerID="00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae" exitCode=0 Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.849594 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" event={"ID":"33e5b2d8-3161-4b8e-b40a-64b3c6c09138","Type":"ContainerDied","Data":"00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae"} Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.849633 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" event={"ID":"33e5b2d8-3161-4b8e-b40a-64b3c6c09138","Type":"ContainerDied","Data":"9d9ebc4afe57d7a198389504197e49c8971611eebaad7a3f147ce8fa4a786016"} Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.849657 4728 
scope.go:117] "RemoveContainer" containerID="00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.849807 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fdb9bc75-sh8qj" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.852568 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05cf94b4-4884-4e05-9036-3676fb8aedcb","Type":"ContainerStarted","Data":"c7763e10d94ce3dc983c594bf81e3433644880e324132463ccde0003ed17134d"} Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.905413 4728 scope.go:117] "RemoveContainer" containerID="7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.947123 4728 scope.go:117] "RemoveContainer" containerID="00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae" Jan 25 05:56:32 crc kubenswrapper[4728]: E0125 05:56:32.947725 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae\": container with ID starting with 00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae not found: ID does not exist" containerID="00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.947790 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae"} err="failed to get container status \"00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae\": rpc error: code = NotFound desc = could not find container \"00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae\": container with ID starting with 00b6623d235e9971e85da453461c4e4ada1131745daa7b0848a628065b9d85ae not found: ID 
does not exist" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.947815 4728 scope.go:117] "RemoveContainer" containerID="7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987" Jan 25 05:56:32 crc kubenswrapper[4728]: E0125 05:56:32.948103 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987\": container with ID starting with 7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987 not found: ID does not exist" containerID="7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987" Jan 25 05:56:32 crc kubenswrapper[4728]: I0125 05:56:32.948125 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987"} err="failed to get container status \"7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987\": rpc error: code = NotFound desc = could not find container \"7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987\": container with ID starting with 7f1f672009c493ff93db9f88b0442ec406a9a2068faf57f40ddbe31dfba02987 not found: ID does not exist" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.009951 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-sb\") pod \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.010148 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-config\") pod \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 
05:56:33.010251 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-nb\") pod \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.010275 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-svc\") pod \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.010367 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-swift-storage-0\") pod \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.010435 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2qjm\" (UniqueName: \"kubernetes.io/projected/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-kube-api-access-f2qjm\") pod \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\" (UID: \"33e5b2d8-3161-4b8e-b40a-64b3c6c09138\") " Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.016509 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-kube-api-access-f2qjm" (OuterVolumeSpecName: "kube-api-access-f2qjm") pod "33e5b2d8-3161-4b8e-b40a-64b3c6c09138" (UID: "33e5b2d8-3161-4b8e-b40a-64b3c6c09138"). InnerVolumeSpecName "kube-api-access-f2qjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.046471 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33e5b2d8-3161-4b8e-b40a-64b3c6c09138" (UID: "33e5b2d8-3161-4b8e-b40a-64b3c6c09138"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.048561 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33e5b2d8-3161-4b8e-b40a-64b3c6c09138" (UID: "33e5b2d8-3161-4b8e-b40a-64b3c6c09138"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.048622 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-config" (OuterVolumeSpecName: "config") pod "33e5b2d8-3161-4b8e-b40a-64b3c6c09138" (UID: "33e5b2d8-3161-4b8e-b40a-64b3c6c09138"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.051342 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33e5b2d8-3161-4b8e-b40a-64b3c6c09138" (UID: "33e5b2d8-3161-4b8e-b40a-64b3c6c09138"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.051966 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33e5b2d8-3161-4b8e-b40a-64b3c6c09138" (UID: "33e5b2d8-3161-4b8e-b40a-64b3c6c09138"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.120929 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.120976 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.120990 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.121001 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2qjm\" (UniqueName: \"kubernetes.io/projected/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-kube-api-access-f2qjm\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.121013 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.121023 4728 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/33e5b2d8-3161-4b8e-b40a-64b3c6c09138-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.184960 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fdb9bc75-sh8qj"] Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.192579 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64fdb9bc75-sh8qj"] Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.338417 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" path="/var/lib/kubelet/pods/33e5b2d8-3161-4b8e-b40a-64b3c6c09138/volumes" Jan 25 05:56:33 crc kubenswrapper[4728]: E0125 05:56:33.376062 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e5b2d8_3161_4b8e_b40a_64b3c6c09138.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e5b2d8_3161_4b8e_b40a_64b3c6c09138.slice/crio-9d9ebc4afe57d7a198389504197e49c8971611eebaad7a3f147ce8fa4a786016\": RecentStats: unable to find data in memory cache]" Jan 25 05:56:33 crc kubenswrapper[4728]: I0125 05:56:33.866051 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05cf94b4-4884-4e05-9036-3676fb8aedcb","Type":"ContainerStarted","Data":"7d36048a5d11fac08cdf62ca66a4b37648a46d9de31c98b344abc17383419b9e"} Jan 25 05:56:34 crc kubenswrapper[4728]: I0125 05:56:34.878522 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05cf94b4-4884-4e05-9036-3676fb8aedcb","Type":"ContainerStarted","Data":"4bccfdf6899930d9cd7a7757dad90d1f4d56280e50a67bf40c4766a62cedfa75"} Jan 25 05:56:35 crc kubenswrapper[4728]: I0125 05:56:35.407639 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:35 crc kubenswrapper[4728]: I0125 05:56:35.423386 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:35 crc kubenswrapper[4728]: I0125 05:56:35.892800 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05cf94b4-4884-4e05-9036-3676fb8aedcb","Type":"ContainerStarted","Data":"957deb79840a9f4bca3609e7261b9e2c803bc78257e9167b507cb690f4a8c6bd"} Jan 25 05:56:35 crc kubenswrapper[4728]: I0125 05:56:35.893155 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 25 05:56:35 crc kubenswrapper[4728]: I0125 05:56:35.917893 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 25 05:56:35 crc kubenswrapper[4728]: I0125 05:56:35.919746 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.418220909 podStartE2EDuration="5.919730477s" podCreationTimestamp="2026-01-25 05:56:30 +0000 UTC" firstStartedPulling="2026-01-25 05:56:31.613254533 +0000 UTC m=+1082.649132514" lastFinishedPulling="2026-01-25 05:56:35.114764103 +0000 UTC m=+1086.150642082" observedRunningTime="2026-01-25 05:56:35.908846682 +0000 UTC m=+1086.944724662" watchObservedRunningTime="2026-01-25 05:56:35.919730477 +0000 UTC m=+1086.955608457" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.049892 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-h9fs9"] Jan 25 05:56:36 crc kubenswrapper[4728]: E0125 05:56:36.050227 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerName="init" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.050245 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" 
containerName="init" Jan 25 05:56:36 crc kubenswrapper[4728]: E0125 05:56:36.050269 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerName="dnsmasq-dns" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.050276 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerName="dnsmasq-dns" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.050451 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e5b2d8-3161-4b8e-b40a-64b3c6c09138" containerName="dnsmasq-dns" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.050997 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.052838 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.053288 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.063609 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-h9fs9"] Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.184932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhz89\" (UniqueName: \"kubernetes.io/projected/6804529a-198c-458a-98a3-4bcb6685b74c-kube-api-access-zhz89\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.185033 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-config-data\") pod 
\"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.185078 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.185130 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-scripts\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.287148 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-config-data\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.287209 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.287333 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-scripts\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: 
\"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.287489 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhz89\" (UniqueName: \"kubernetes.io/projected/6804529a-198c-458a-98a3-4bcb6685b74c-kube-api-access-zhz89\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.294189 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-config-data\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.294385 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.299881 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-scripts\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.302595 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhz89\" (UniqueName: \"kubernetes.io/projected/6804529a-198c-458a-98a3-4bcb6685b74c-kube-api-access-zhz89\") pod \"nova-cell1-cell-mapping-h9fs9\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") " 
pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.364220 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h9fs9" Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.773815 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-h9fs9"] Jan 25 05:56:36 crc kubenswrapper[4728]: W0125 05:56:36.776671 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6804529a_198c_458a_98a3_4bcb6685b74c.slice/crio-ca053b20a964ef413503cf012743af709f9c340cc34126a20a3d80b144b9eff8 WatchSource:0}: Error finding container ca053b20a964ef413503cf012743af709f9c340cc34126a20a3d80b144b9eff8: Status 404 returned error can't find the container with id ca053b20a964ef413503cf012743af709f9c340cc34126a20a3d80b144b9eff8 Jan 25 05:56:36 crc kubenswrapper[4728]: I0125 05:56:36.900164 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h9fs9" event={"ID":"6804529a-198c-458a-98a3-4bcb6685b74c","Type":"ContainerStarted","Data":"ca053b20a964ef413503cf012743af709f9c340cc34126a20a3d80b144b9eff8"} Jan 25 05:56:37 crc kubenswrapper[4728]: I0125 05:56:37.907868 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h9fs9" event={"ID":"6804529a-198c-458a-98a3-4bcb6685b74c","Type":"ContainerStarted","Data":"78343fd1c215f9b50d40a96050efe3cac79744b37724c31f928ae1c4ff10009d"} Jan 25 05:56:37 crc kubenswrapper[4728]: I0125 05:56:37.924163 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-h9fs9" podStartSLOduration=1.924147322 podStartE2EDuration="1.924147322s" podCreationTimestamp="2026-01-25 05:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-25 05:56:37.923444507 +0000 UTC m=+1088.959322486" watchObservedRunningTime="2026-01-25 05:56:37.924147322 +0000 UTC m=+1088.960025302" Jan 25 05:56:39 crc kubenswrapper[4728]: I0125 05:56:39.195173 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 25 05:56:39 crc kubenswrapper[4728]: I0125 05:56:39.195596 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 25 05:56:40 crc kubenswrapper[4728]: I0125 05:56:40.208459 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:40 crc kubenswrapper[4728]: I0125 05:56:40.208459 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:41 crc kubenswrapper[4728]: I0125 05:56:41.940242 4728 generic.go:334] "Generic (PLEG): container finished" podID="6804529a-198c-458a-98a3-4bcb6685b74c" containerID="78343fd1c215f9b50d40a96050efe3cac79744b37724c31f928ae1c4ff10009d" exitCode=0 Jan 25 05:56:41 crc kubenswrapper[4728]: I0125 05:56:41.940361 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h9fs9" event={"ID":"6804529a-198c-458a-98a3-4bcb6685b74c","Type":"ContainerDied","Data":"78343fd1c215f9b50d40a96050efe3cac79744b37724c31f928ae1c4ff10009d"} Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.241737 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h9fs9"
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.424437 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhz89\" (UniqueName: \"kubernetes.io/projected/6804529a-198c-458a-98a3-4bcb6685b74c-kube-api-access-zhz89\") pod \"6804529a-198c-458a-98a3-4bcb6685b74c\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") "
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.424625 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-config-data\") pod \"6804529a-198c-458a-98a3-4bcb6685b74c\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") "
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.426793 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-combined-ca-bundle\") pod \"6804529a-198c-458a-98a3-4bcb6685b74c\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") "
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.426946 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-scripts\") pod \"6804529a-198c-458a-98a3-4bcb6685b74c\" (UID: \"6804529a-198c-458a-98a3-4bcb6685b74c\") "
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.434446 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6804529a-198c-458a-98a3-4bcb6685b74c-kube-api-access-zhz89" (OuterVolumeSpecName: "kube-api-access-zhz89") pod "6804529a-198c-458a-98a3-4bcb6685b74c" (UID: "6804529a-198c-458a-98a3-4bcb6685b74c"). InnerVolumeSpecName "kube-api-access-zhz89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.437191 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-scripts" (OuterVolumeSpecName: "scripts") pod "6804529a-198c-458a-98a3-4bcb6685b74c" (UID: "6804529a-198c-458a-98a3-4bcb6685b74c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.449741 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6804529a-198c-458a-98a3-4bcb6685b74c" (UID: "6804529a-198c-458a-98a3-4bcb6685b74c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.455533 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-config-data" (OuterVolumeSpecName: "config-data") pod "6804529a-198c-458a-98a3-4bcb6685b74c" (UID: "6804529a-198c-458a-98a3-4bcb6685b74c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.529207 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.529238 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-scripts\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.529248 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhz89\" (UniqueName: \"kubernetes.io/projected/6804529a-198c-458a-98a3-4bcb6685b74c-kube-api-access-zhz89\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.529260 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6804529a-198c-458a-98a3-4bcb6685b74c-config-data\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.956469 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h9fs9" event={"ID":"6804529a-198c-458a-98a3-4bcb6685b74c","Type":"ContainerDied","Data":"ca053b20a964ef413503cf012743af709f9c340cc34126a20a3d80b144b9eff8"}
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.956521 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h9fs9"
Jan 25 05:56:43 crc kubenswrapper[4728]: I0125 05:56:43.956528 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca053b20a964ef413503cf012743af709f9c340cc34126a20a3d80b144b9eff8"
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.117052 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.117300 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-log" containerID="cri-o://6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1" gracePeriod=30
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.117401 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-api" containerID="cri-o://1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83" gracePeriod=30
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.128049 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.128245 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8fed3aed-aad1-4995-95b5-fe247410707e" containerName="nova-scheduler-scheduler" containerID="cri-o://fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848" gracePeriod=30
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.136227 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.136507 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-log" containerID="cri-o://2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59" gracePeriod=30
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.136583 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-metadata" containerID="cri-o://3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3" gracePeriod=30
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.964992 4728 generic.go:334] "Generic (PLEG): container finished" podID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerID="6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1" exitCode=143
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.965058 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26","Type":"ContainerDied","Data":"6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1"}
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.966729 4728 generic.go:334] "Generic (PLEG): container finished" podID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerID="2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59" exitCode=143
Jan 25 05:56:44 crc kubenswrapper[4728]: I0125 05:56:44.966787 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254","Type":"ContainerDied","Data":"2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59"}
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.525998 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.669497 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-combined-ca-bundle\") pod \"8fed3aed-aad1-4995-95b5-fe247410707e\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") "
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.669549 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-config-data\") pod \"8fed3aed-aad1-4995-95b5-fe247410707e\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") "
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.669575 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ltxr\" (UniqueName: \"kubernetes.io/projected/8fed3aed-aad1-4995-95b5-fe247410707e-kube-api-access-8ltxr\") pod \"8fed3aed-aad1-4995-95b5-fe247410707e\" (UID: \"8fed3aed-aad1-4995-95b5-fe247410707e\") "
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.679705 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fed3aed-aad1-4995-95b5-fe247410707e-kube-api-access-8ltxr" (OuterVolumeSpecName: "kube-api-access-8ltxr") pod "8fed3aed-aad1-4995-95b5-fe247410707e" (UID: "8fed3aed-aad1-4995-95b5-fe247410707e"). InnerVolumeSpecName "kube-api-access-8ltxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.694458 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fed3aed-aad1-4995-95b5-fe247410707e" (UID: "8fed3aed-aad1-4995-95b5-fe247410707e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.698719 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-config-data" (OuterVolumeSpecName: "config-data") pod "8fed3aed-aad1-4995-95b5-fe247410707e" (UID: "8fed3aed-aad1-4995-95b5-fe247410707e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.771623 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.771657 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fed3aed-aad1-4995-95b5-fe247410707e-config-data\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.771668 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ltxr\" (UniqueName: \"kubernetes.io/projected/8fed3aed-aad1-4995-95b5-fe247410707e-kube-api-access-8ltxr\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.976309 4728 generic.go:334] "Generic (PLEG): container finished" podID="8fed3aed-aad1-4995-95b5-fe247410707e" containerID="fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848" exitCode=0
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.976386 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.976388 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8fed3aed-aad1-4995-95b5-fe247410707e","Type":"ContainerDied","Data":"fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848"}
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.976533 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8fed3aed-aad1-4995-95b5-fe247410707e","Type":"ContainerDied","Data":"9a3a652300ff57c3c65252f738657da4aebd48a46562b4eeaed95e1b370eddd7"}
Jan 25 05:56:45 crc kubenswrapper[4728]: I0125 05:56:45.976566 4728 scope.go:117] "RemoveContainer" containerID="fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.004744 4728 scope.go:117] "RemoveContainer" containerID="fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.005065 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 25 05:56:46 crc kubenswrapper[4728]: E0125 05:56:46.005429 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848\": container with ID starting with fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848 not found: ID does not exist" containerID="fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.005472 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848"} err="failed to get container status \"fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848\": rpc error: code = NotFound desc = could not find container \"fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848\": container with ID starting with fe7d421f89822d1dec5170ac4e2e750c57f70b521fd2b3128845a2b5868ee848 not found: ID does not exist"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.011243 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.016094 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 25 05:56:46 crc kubenswrapper[4728]: E0125 05:56:46.016408 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6804529a-198c-458a-98a3-4bcb6685b74c" containerName="nova-manage"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.016426 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6804529a-198c-458a-98a3-4bcb6685b74c" containerName="nova-manage"
Jan 25 05:56:46 crc kubenswrapper[4728]: E0125 05:56:46.016465 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fed3aed-aad1-4995-95b5-fe247410707e" containerName="nova-scheduler-scheduler"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.016473 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fed3aed-aad1-4995-95b5-fe247410707e" containerName="nova-scheduler-scheduler"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.016619 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fed3aed-aad1-4995-95b5-fe247410707e" containerName="nova-scheduler-scheduler"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.016640 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6804529a-198c-458a-98a3-4bcb6685b74c" containerName="nova-manage"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.017150 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.019807 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.028438 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.178564 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-config-data\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.178657 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64dl6\" (UniqueName: \"kubernetes.io/projected/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-kube-api-access-64dl6\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.178697 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.279599 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.279703 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-config-data\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.279766 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64dl6\" (UniqueName: \"kubernetes.io/projected/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-kube-api-access-64dl6\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.286150 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.287065 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-config-data\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.294933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64dl6\" (UniqueName: \"kubernetes.io/projected/086fe2f2-83d2-440c-bcde-f3d1bf8f21c8-kube-api-access-64dl6\") pod \"nova-scheduler-0\" (UID: \"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8\") " pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.331662 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.731920 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.985361 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8","Type":"ContainerStarted","Data":"771cd9d3619e345d0c28a928c6f68583de92a2c6423383d8314b515c0ea132f2"}
Jan 25 05:56:46 crc kubenswrapper[4728]: I0125 05:56:46.985647 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"086fe2f2-83d2-440c-bcde-f3d1bf8f21c8","Type":"ContainerStarted","Data":"0424ccf573cea039253d3d74ce3e5d805576e695ae8ef9d91c13b0f3441f9a2a"}
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.008227 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.008211011 podStartE2EDuration="1.008211011s" podCreationTimestamp="2026-01-25 05:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:46.999952547 +0000 UTC m=+1098.035830527" watchObservedRunningTime="2026-01-25 05:56:47.008211011 +0000 UTC m=+1098.044088991"
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.261219 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:48242->10.217.0.192:8775: read: connection reset by peer"
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.261248 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:48244->10.217.0.192:8775: read: connection reset by peer"
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.337699 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fed3aed-aad1-4995-95b5-fe247410707e" path="/var/lib/kubelet/pods/8fed3aed-aad1-4995-95b5-fe247410707e/volumes"
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.713950 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.716273 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914153 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dstf7\" (UniqueName: \"kubernetes.io/projected/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-kube-api-access-dstf7\") pod \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914208 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-internal-tls-certs\") pod \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914248 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-public-tls-certs\") pod \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914310 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-logs\") pod \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914418 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-config-data\") pod \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914486 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4lq2\" (UniqueName: \"kubernetes.io/projected/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-kube-api-access-p4lq2\") pod \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914518 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-nova-metadata-tls-certs\") pod \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914602 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-logs\") pod \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914686 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-config-data\") pod \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914719 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-combined-ca-bundle\") pod \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\" (UID: \"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914749 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-combined-ca-bundle\") pod \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\" (UID: \"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254\") "
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.914856 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-logs" (OuterVolumeSpecName: "logs") pod "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" (UID: "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.915266 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-logs\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.916515 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-logs" (OuterVolumeSpecName: "logs") pod "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" (UID: "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.921566 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-kube-api-access-p4lq2" (OuterVolumeSpecName: "kube-api-access-p4lq2") pod "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" (UID: "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26"). InnerVolumeSpecName "kube-api-access-p4lq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.922969 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-kube-api-access-dstf7" (OuterVolumeSpecName: "kube-api-access-dstf7") pod "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" (UID: "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254"). InnerVolumeSpecName "kube-api-access-dstf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.944633 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" (UID: "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.945027 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" (UID: "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.956085 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-config-data" (OuterVolumeSpecName: "config-data") pod "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" (UID: "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.956069 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-config-data" (OuterVolumeSpecName: "config-data") pod "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" (UID: "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.964907 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" (UID: "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.976810 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" (UID: "eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:47 crc kubenswrapper[4728]: I0125 05:56:47.979580 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" (UID: "8ee641d6-fd82-4049-9e2f-5f2d9f2f6254"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.000208 4728 generic.go:334] "Generic (PLEG): container finished" podID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerID="1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83" exitCode=0
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.000289 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26","Type":"ContainerDied","Data":"1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83"}
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.000338 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26","Type":"ContainerDied","Data":"ec7cdf8bd3e48c75e0d01012424ae314167706d826bee1cf76b704e581069484"}
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.000359 4728 scope.go:117] "RemoveContainer" containerID="1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.000308 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.006047 4728 generic.go:334] "Generic (PLEG): container finished" podID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerID="3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3" exitCode=0
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.006144 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254","Type":"ContainerDied","Data":"3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3"}
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.006134 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.006252 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ee641d6-fd82-4049-9e2f-5f2d9f2f6254","Type":"ContainerDied","Data":"2d35a9ff10e4750efebcb5b1a01c2b067eb9a12193c7a9070fc3893d443ae8ee"}
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020112 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dstf7\" (UniqueName: \"kubernetes.io/projected/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-kube-api-access-dstf7\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020147 4728 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020160 4728 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020173 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-config-data\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020185 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4lq2\" (UniqueName: \"kubernetes.io/projected/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-kube-api-access-p4lq2\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020199 4728 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020209 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-logs\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020223 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-config-data\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020233 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.020242 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.024761 4728 scope.go:117] "RemoveContainer" containerID="6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.057563 4728 scope.go:117] "RemoveContainer" containerID="1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83"
Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.064468 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83\": container with ID starting with 1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83 not found: ID does not exist" containerID="1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.064496 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.064525 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83"} err="failed to get container status \"1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83\": rpc error: code = NotFound desc = could not find container \"1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83\": container with ID starting with 1b2260bd079838682ea51109f94dffc4124a364a4c792d08c681e9f349efed83 not found: ID does not exist"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.064571 4728 scope.go:117] "RemoveContainer" containerID="6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1"
Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.068265 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1\": container with ID starting with 6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1 not found: ID does not exist" containerID="6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.068291 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1"} err="failed to get container status \"6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1\": rpc error: code = NotFound desc = could not find container \"6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1\": container with ID starting with 6015c457fd05444e5e27ddba43d2e82a9ad5d1720a53a68b052a45630dca81a1 not found: ID does not exist"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.068313 4728 scope.go:117] "RemoveContainer" containerID="3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.081594 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.093009 4728 scope.go:117] "RemoveContainer" containerID="2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59"
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.097653 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.115039 4728 scope.go:117] "RemoveContainer" containerID="3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3"
Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.115365 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3\": container with ID starting with 3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3 not found: ID does not exist" containerID="3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3"
Jan 25 05:56:48 crc
kubenswrapper[4728]: I0125 05:56:48.115387 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3"} err="failed to get container status \"3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3\": rpc error: code = NotFound desc = could not find container \"3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3\": container with ID starting with 3911619adf7fa7b61f074303009cf07fb0ab6315e437d53a329f191da71de3b3 not found: ID does not exist" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.115403 4728 scope.go:117] "RemoveContainer" containerID="2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59" Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.115597 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59\": container with ID starting with 2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59 not found: ID does not exist" containerID="2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.115613 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59"} err="failed to get container status \"2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59\": rpc error: code = NotFound desc = could not find container \"2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59\": container with ID starting with 2dec04a0d45ff78c0cc60cc16c8dba22081e98973ac6942737798d031a144d59 not found: ID does not exist" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.116130 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:48 crc kubenswrapper[4728]: 
I0125 05:56:48.122106 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.122666 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-log" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122685 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-log" Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.122702 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-metadata" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122709 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-metadata" Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.122717 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-api" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122724 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-api" Jan 25 05:56:48 crc kubenswrapper[4728]: E0125 05:56:48.122733 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-log" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122738 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-log" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122897 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-log" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122920 4728 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" containerName="nova-metadata-metadata" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122929 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-log" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.122936 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" containerName="nova-api-api" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.123811 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.127566 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.127696 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.128526 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.136534 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.137759 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.140431 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.140579 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.140755 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.140976 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224491 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224621 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-config-data\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224672 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aa160e-7328-465e-8908-a78bb2fc8364-logs\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224700 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224733 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-config-data\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224753 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-public-tls-certs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224776 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224849 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224905 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j69d\" (UniqueName: \"kubernetes.io/projected/56aa160e-7328-465e-8908-a78bb2fc8364-kube-api-access-9j69d\") 
pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.224997 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7bc\" (UniqueName: \"kubernetes.io/projected/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-kube-api-access-nl7bc\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.225056 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-logs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326767 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-config-data\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326816 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-public-tls-certs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326839 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326861 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326891 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j69d\" (UniqueName: \"kubernetes.io/projected/56aa160e-7328-465e-8908-a78bb2fc8364-kube-api-access-9j69d\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326919 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7bc\" (UniqueName: \"kubernetes.io/projected/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-kube-api-access-nl7bc\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.326945 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-logs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.327002 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.327053 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-config-data\") pod 
\"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.327076 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aa160e-7328-465e-8908-a78bb2fc8364-logs\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.327093 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.327912 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aa160e-7328-465e-8908-a78bb2fc8364-logs\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.327931 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-logs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.334077 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-config-data\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.334376 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-public-tls-certs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.336894 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-config-data\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.337019 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.337940 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.337967 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.338350 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56aa160e-7328-465e-8908-a78bb2fc8364-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 
05:56:48.341878 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j69d\" (UniqueName: \"kubernetes.io/projected/56aa160e-7328-465e-8908-a78bb2fc8364-kube-api-access-9j69d\") pod \"nova-metadata-0\" (UID: \"56aa160e-7328-465e-8908-a78bb2fc8364\") " pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.343047 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7bc\" (UniqueName: \"kubernetes.io/projected/bf4fc010-98c5-4734-a9c9-3de4f1d1a34b-kube-api-access-nl7bc\") pod \"nova-api-0\" (UID: \"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b\") " pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.442907 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.458173 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.865646 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 25 05:56:48 crc kubenswrapper[4728]: I0125 05:56:48.909650 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 25 05:56:49 crc kubenswrapper[4728]: I0125 05:56:49.019184 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b","Type":"ContainerStarted","Data":"de160952e2d982d4ea20dc4551e7a69575e3f571c3b4fea89263f2019b41534f"} Jan 25 05:56:49 crc kubenswrapper[4728]: I0125 05:56:49.025975 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aa160e-7328-465e-8908-a78bb2fc8364","Type":"ContainerStarted","Data":"f8d7444528ae1bf7c0cef29b2d3b482ece15654633ac28650668bd56572556a9"} Jan 25 05:56:49 crc kubenswrapper[4728]: I0125 05:56:49.342205 4728 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee641d6-fd82-4049-9e2f-5f2d9f2f6254" path="/var/lib/kubelet/pods/8ee641d6-fd82-4049-9e2f-5f2d9f2f6254/volumes" Jan 25 05:56:49 crc kubenswrapper[4728]: I0125 05:56:49.342995 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26" path="/var/lib/kubelet/pods/eae4d6df-c0a0-400d-85c6-8c6d7e6c4a26/volumes" Jan 25 05:56:50 crc kubenswrapper[4728]: I0125 05:56:50.035385 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b","Type":"ContainerStarted","Data":"fd969e0cf242ccd69ac852d3ff6e57192b2968294afc611061afde36ed8325ca"} Jan 25 05:56:50 crc kubenswrapper[4728]: I0125 05:56:50.035434 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4fc010-98c5-4734-a9c9-3de4f1d1a34b","Type":"ContainerStarted","Data":"694b0307ca4bf81fbeb3d5a57b1fd06070facd84cc588ef7dc6bd483bd1fb4c4"} Jan 25 05:56:50 crc kubenswrapper[4728]: I0125 05:56:50.037057 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aa160e-7328-465e-8908-a78bb2fc8364","Type":"ContainerStarted","Data":"bb0865fc2ca7f188dfe0157dfe5dfe43ad1f0415fe56f98b77407a263779947a"} Jan 25 05:56:50 crc kubenswrapper[4728]: I0125 05:56:50.037476 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aa160e-7328-465e-8908-a78bb2fc8364","Type":"ContainerStarted","Data":"da58af1df1e1483ec618686e1265dc1c60db9e82cf6584108eec64fa092b368c"} Jan 25 05:56:50 crc kubenswrapper[4728]: I0125 05:56:50.060443 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.060425571 podStartE2EDuration="2.060425571s" podCreationTimestamp="2026-01-25 05:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:50.050295548 +0000 UTC m=+1101.086173528" watchObservedRunningTime="2026-01-25 05:56:50.060425571 +0000 UTC m=+1101.096303552" Jan 25 05:56:50 crc kubenswrapper[4728]: I0125 05:56:50.070803 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.070779647 podStartE2EDuration="2.070779647s" podCreationTimestamp="2026-01-25 05:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:56:50.066831672 +0000 UTC m=+1101.102709652" watchObservedRunningTime="2026-01-25 05:56:50.070779647 +0000 UTC m=+1101.106657626" Jan 25 05:56:51 crc kubenswrapper[4728]: I0125 05:56:51.337561 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 25 05:56:53 crc kubenswrapper[4728]: I0125 05:56:53.444538 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 25 05:56:53 crc kubenswrapper[4728]: I0125 05:56:53.444929 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 25 05:56:56 crc kubenswrapper[4728]: I0125 05:56:56.332123 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 25 05:56:56 crc kubenswrapper[4728]: I0125 05:56:56.358428 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 25 05:56:57 crc kubenswrapper[4728]: I0125 05:56:57.119597 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 25 05:56:58 crc kubenswrapper[4728]: I0125 05:56:58.444628 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 25 05:56:58 crc kubenswrapper[4728]: I0125 05:56:58.444781 4728 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 25 05:56:58 crc kubenswrapper[4728]: I0125 05:56:58.458690 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 25 05:56:58 crc kubenswrapper[4728]: I0125 05:56:58.458730 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 25 05:56:59 crc kubenswrapper[4728]: I0125 05:56:59.458446 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="56aa160e-7328-465e-8908-a78bb2fc8364" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:59 crc kubenswrapper[4728]: I0125 05:56:59.458512 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="56aa160e-7328-465e-8908-a78bb2fc8364" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:59 crc kubenswrapper[4728]: I0125 05:56:59.469462 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf4fc010-98c5-4734-a9c9-3de4f1d1a34b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 25 05:56:59 crc kubenswrapper[4728]: I0125 05:56:59.469461 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf4fc010-98c5-4734-a9c9-3de4f1d1a34b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 25 05:57:01 crc kubenswrapper[4728]: I0125 05:57:01.213082 4728 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.449305 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.450286 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.457189 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.473498 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.473789 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.474727 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 25 05:57:08 crc kubenswrapper[4728]: I0125 05:57:08.480156 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 25 05:57:09 crc kubenswrapper[4728]: I0125 05:57:09.205331 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 25 05:57:09 crc kubenswrapper[4728]: I0125 05:57:09.212847 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 25 05:57:09 crc kubenswrapper[4728]: I0125 05:57:09.214470 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 25 05:57:15 crc kubenswrapper[4728]: I0125 05:57:15.521602 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:57:16 crc kubenswrapper[4728]: I0125 05:57:16.219699 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:57:19 crc kubenswrapper[4728]: I0125 05:57:19.456206 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="rabbitmq" containerID="cri-o://1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77" gracePeriod=604797 Jan 25 05:57:19 crc kubenswrapper[4728]: I0125 05:57:19.706187 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Jan 25 05:57:19 crc kubenswrapper[4728]: I0125 05:57:19.878242 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="rabbitmq" containerID="cri-o://14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1" gracePeriod=604797 Jan 25 05:57:19 crc kubenswrapper[4728]: I0125 05:57:19.984029 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Jan 25 05:57:25 crc kubenswrapper[4728]: I0125 05:57:25.986147 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166094 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52qg8\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-kube-api-access-52qg8\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166150 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-plugins\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166264 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-config-data\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166353 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-server-conf\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166388 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ddd3d99e-20c0-4133-9537-413f83a04edb-erlang-cookie-secret\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166413 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-tls\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166432 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-erlang-cookie\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166467 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-confd\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166580 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166631 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-plugins-conf\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.166653 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ddd3d99e-20c0-4133-9537-413f83a04edb-pod-info\") pod \"ddd3d99e-20c0-4133-9537-413f83a04edb\" (UID: \"ddd3d99e-20c0-4133-9537-413f83a04edb\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 
05:57:26.170658 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.172579 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ddd3d99e-20c0-4133-9537-413f83a04edb-pod-info" (OuterVolumeSpecName: "pod-info") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.179520 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.181583 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.186363 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.192269 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-kube-api-access-52qg8" (OuterVolumeSpecName: "kube-api-access-52qg8") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "kube-api-access-52qg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.193660 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd3d99e-20c0-4133-9537-413f83a04edb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.202676 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-config-data" (OuterVolumeSpecName: "config-data") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.205966 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.222687 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-server-conf" (OuterVolumeSpecName: "server-conf") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.253947 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ddd3d99e-20c0-4133-9537-413f83a04edb" (UID: "ddd3d99e-20c0-4133-9537-413f83a04edb"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.284719 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.284758 4728 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.284889 4728 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ddd3d99e-20c0-4133-9537-413f83a04edb-pod-info\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.284906 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52qg8\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-kube-api-access-52qg8\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285127 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285138 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285148 4728 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ddd3d99e-20c0-4133-9537-413f83a04edb-server-conf\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285159 4728 
reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ddd3d99e-20c0-4133-9537-413f83a04edb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285169 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285204 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.285214 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ddd3d99e-20c0-4133-9537-413f83a04edb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.302271 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.323896 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.368482 4728 generic.go:334] "Generic (PLEG): container finished" podID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerID="14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1" exitCode=0 Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.368541 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2","Type":"ContainerDied","Data":"14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1"} Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.368569 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2","Type":"ContainerDied","Data":"71097818bb7edee0d694a7e3e899c036a69ba9e86dc9332774d6cc69365c5367"} Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.368593 4728 scope.go:117] "RemoveContainer" containerID="14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.368722 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.378614 4728 generic.go:334] "Generic (PLEG): container finished" podID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerID="1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77" exitCode=0 Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.378650 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ddd3d99e-20c0-4133-9537-413f83a04edb","Type":"ContainerDied","Data":"1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77"} Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.378672 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ddd3d99e-20c0-4133-9537-413f83a04edb","Type":"ContainerDied","Data":"1c37604a4a59ee8a403c1782230219c6fca112ca4f3ba003e81afc4fd6307dc6"} Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.378721 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386229 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-plugins\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386349 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-server-conf\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386391 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-config-data\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386489 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfxd8\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-kube-api-access-rfxd8\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386512 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-pod-info\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386526 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386544 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-plugins-conf\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386606 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-erlang-cookie\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386629 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-tls\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386659 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-confd\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.386709 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-erlang-cookie-secret\") pod \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\" (UID: \"05af443f-cc1e-4f2c-bb6d-a11bc9647ce2\") " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 
05:57:26.387387 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.387799 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.388169 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.392664 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.393671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-kube-api-access-rfxd8" (OuterVolumeSpecName: "kube-api-access-rfxd8") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "kube-api-access-rfxd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.396879 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.396902 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfxd8\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-kube-api-access-rfxd8\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.396917 4728 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.396950 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.396959 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.396968 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.401648 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-pod-info" (OuterVolumeSpecName: "pod-info") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.418743 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.421692 4728 scope.go:117] "RemoveContainer" containerID="245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.425297 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-config-data" (OuterVolumeSpecName: "config-data") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.426787 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.430370 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.454513 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-server-conf" (OuterVolumeSpecName: "server-conf") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.463452 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.488153 4728 scope.go:117] "RemoveContainer" containerID="14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.493614 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1\": container with ID starting with 14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1 not found: ID does not exist" containerID="14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.493655 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1"} err="failed to get container status \"14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1\": rpc error: code = NotFound desc = could not find container \"14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1\": container with ID starting with 
14262fdf5ebaa1c71ce7cfe87bac9ba13f457d2e00cd27a9c338ac15d5b3b9c1 not found: ID does not exist" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.493678 4728 scope.go:117] "RemoveContainer" containerID="245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.494438 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e\": container with ID starting with 245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e not found: ID does not exist" containerID="245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.494463 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e"} err="failed to get container status \"245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e\": rpc error: code = NotFound desc = could not find container \"245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e\": container with ID starting with 245a770cd343bd5b99adf3154252a26fa6659f9eebd097472878a9e4f0115b1e not found: ID does not exist" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.494476 4728 scope.go:117] "RemoveContainer" containerID="1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.496900 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.497288 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="rabbitmq" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.497305 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="rabbitmq" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.497331 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="setup-container" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.497338 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="setup-container" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.497361 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="rabbitmq" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.497367 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="rabbitmq" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.497377 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="setup-container" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.497383 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="setup-container" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.497529 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" containerName="rabbitmq" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.497548 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" containerName="rabbitmq" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.498898 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.499472 4728 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-server-conf\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.499502 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.499512 4728 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-pod-info\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.499533 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.499543 4728 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.504072 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.504329 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.505543 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.505664 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-plugins-conf" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.505810 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7x6cq" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.509199 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.509342 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.522877 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.535676 4728 scope.go:117] "RemoveContainer" containerID="c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.537064 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.582202 4728 scope.go:117] "RemoveContainer" containerID="1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.582696 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77\": container with ID starting with 1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77 not found: ID does not exist" containerID="1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.582723 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77"} 
err="failed to get container status \"1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77\": rpc error: code = NotFound desc = could not find container \"1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77\": container with ID starting with 1a5eed0c728f22c3ef8fd35c36b97a47671658c6e316bfb5a229264df6e21e77 not found: ID does not exist" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.582746 4728 scope.go:117] "RemoveContainer" containerID="c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e" Jan 25 05:57:26 crc kubenswrapper[4728]: E0125 05:57:26.588620 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e\": container with ID starting with c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e not found: ID does not exist" containerID="c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.588647 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e"} err="failed to get container status \"c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e\": rpc error: code = NotFound desc = could not find container \"c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e\": container with ID starting with c58514f8d54b777acf1b26c17f0e93e007eef77fd866ab24c2b150ff42798f5e not found: ID does not exist" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.589481 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" (UID: "05af443f-cc1e-4f2c-bb6d-a11bc9647ce2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.608740 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.608808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.608869 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n454j\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-kube-api-access-n454j\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.608900 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.608918 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc 
kubenswrapper[4728]: I0125 05:57:26.608949 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a9b861c-f271-4b2b-865e-925bf405c7d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.608979 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a9b861c-f271-4b2b-865e-925bf405c7d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.609002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.609028 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.609055 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.609087 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.609151 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.609163 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.710901 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.710988 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711057 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n454j\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-kube-api-access-n454j\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 
05:57:26.711084 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711122 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711154 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a9b861c-f271-4b2b-865e-925bf405c7d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711201 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a9b861c-f271-4b2b-865e-925bf405c7d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711222 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711252 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711289 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711346 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.711931 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.713415 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.713513 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.714008 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.714654 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.714996 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a9b861c-f271-4b2b-865e-925bf405c7d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.718558 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a9b861c-f271-4b2b-865e-925bf405c7d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.718949 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a9b861c-f271-4b2b-865e-925bf405c7d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.720221 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.727386 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.730647 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n454j\" (UniqueName: \"kubernetes.io/projected/4a9b861c-f271-4b2b-865e-925bf405c7d1-kube-api-access-n454j\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.751805 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4a9b861c-f271-4b2b-865e-925bf405c7d1\") " pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.811912 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.821828 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.827442 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.830728 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.832460 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.834961 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.834979 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.834986 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-k9w2b" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.835050 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.835127 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.835173 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.835269 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.844293 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916169 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916210 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916257 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/718dab40-f0af-4030-8a9c-2a3a10aa4737-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916308 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5gm\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-kube-api-access-7j5gm\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916397 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916447 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916484 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916504 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916533 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916571 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:26 crc kubenswrapper[4728]: I0125 05:57:26.916589 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/718dab40-f0af-4030-8a9c-2a3a10aa4737-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018493 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018550 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018576 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018608 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018652 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018668 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/718dab40-f0af-4030-8a9c-2a3a10aa4737-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018721 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018742 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018768 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/718dab40-f0af-4030-8a9c-2a3a10aa4737-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018806 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5gm\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-kube-api-access-7j5gm\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 
05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018838 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.018977 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.019225 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.019971 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.020136 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.021956 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.022127 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/718dab40-f0af-4030-8a9c-2a3a10aa4737-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.024694 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.024916 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.029286 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/718dab40-f0af-4030-8a9c-2a3a10aa4737-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.031093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/718dab40-f0af-4030-8a9c-2a3a10aa4737-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.040175 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5gm\" (UniqueName: \"kubernetes.io/projected/718dab40-f0af-4030-8a9c-2a3a10aa4737-kube-api-access-7j5gm\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.054886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"718dab40-f0af-4030-8a9c-2a3a10aa4737\") " pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.090281 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-664b757dcf-jt56d"] Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.092188 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.093642 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.103737 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664b757dcf-jt56d"] Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121171 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-sb\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121224 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-openstack-edpm-ipam\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121410 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-svc\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121446 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-config\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " 
pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121470 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-swift-storage-0\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121491 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-nb\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.121581 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wndr\" (UniqueName: \"kubernetes.io/projected/a14edcc9-69c2-4daf-b586-10ef2f436669-kube-api-access-8wndr\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.201173 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.223537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-sb\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.223957 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-openstack-edpm-ipam\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224029 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-svc\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-config\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-swift-storage-0\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" 
Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224119 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-nb\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224144 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wndr\" (UniqueName: \"kubernetes.io/projected/a14edcc9-69c2-4daf-b586-10ef2f436669-kube-api-access-8wndr\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224517 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-sb\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.224949 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-openstack-edpm-ipam\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.225112 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-config\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.225139 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-svc\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.225710 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-nb\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.226219 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-swift-storage-0\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.240872 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wndr\" (UniqueName: \"kubernetes.io/projected/a14edcc9-69c2-4daf-b586-10ef2f436669-kube-api-access-8wndr\") pod \"dnsmasq-dns-664b757dcf-jt56d\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.243185 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.340211 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05af443f-cc1e-4f2c-bb6d-a11bc9647ce2" path="/var/lib/kubelet/pods/05af443f-cc1e-4f2c-bb6d-a11bc9647ce2/volumes" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.342475 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ddd3d99e-20c0-4133-9537-413f83a04edb" path="/var/lib/kubelet/pods/ddd3d99e-20c0-4133-9537-413f83a04edb/volumes" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.406042 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4a9b861c-f271-4b2b-865e-925bf405c7d1","Type":"ContainerStarted","Data":"c5cad78d778248293571bc87f3cf7c20a78e6c8044aa56f1f6ceb0c2c6db6936"} Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.418437 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.612488 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 25 05:57:27 crc kubenswrapper[4728]: I0125 05:57:27.819580 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664b757dcf-jt56d"] Jan 25 05:57:27 crc kubenswrapper[4728]: W0125 05:57:27.872036 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14edcc9_69c2_4daf_b586_10ef2f436669.slice/crio-f18961914ed4564d71c4a0e35376c08b050b478fa1017b25b07915fa197eaf59 WatchSource:0}: Error finding container f18961914ed4564d71c4a0e35376c08b050b478fa1017b25b07915fa197eaf59: Status 404 returned error can't find the container with id f18961914ed4564d71c4a0e35376c08b050b478fa1017b25b07915fa197eaf59 Jan 25 05:57:28 crc kubenswrapper[4728]: I0125 05:57:28.418430 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4a9b861c-f271-4b2b-865e-925bf405c7d1","Type":"ContainerStarted","Data":"2ff585cf4c4a5114a49cc29737fc13fd5eec409fbbe9b398c54f846eeb5f2d06"} Jan 25 05:57:28 crc kubenswrapper[4728]: I0125 05:57:28.429512 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" 
event={"ID":"a14edcc9-69c2-4daf-b586-10ef2f436669","Type":"ContainerDied","Data":"486474eeeca7650fd80a72513aed9e3721c0e6ad5849c269ca1a585e43401e2a"} Jan 25 05:57:28 crc kubenswrapper[4728]: I0125 05:57:28.429306 4728 generic.go:334] "Generic (PLEG): container finished" podID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerID="486474eeeca7650fd80a72513aed9e3721c0e6ad5849c269ca1a585e43401e2a" exitCode=0 Jan 25 05:57:28 crc kubenswrapper[4728]: I0125 05:57:28.429885 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" event={"ID":"a14edcc9-69c2-4daf-b586-10ef2f436669","Type":"ContainerStarted","Data":"f18961914ed4564d71c4a0e35376c08b050b478fa1017b25b07915fa197eaf59"} Jan 25 05:57:28 crc kubenswrapper[4728]: I0125 05:57:28.439282 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"718dab40-f0af-4030-8a9c-2a3a10aa4737","Type":"ContainerStarted","Data":"d8700583498de598de513fcb5dc4e99aca988a3f7359accb3c44230811ae71ff"} Jan 25 05:57:29 crc kubenswrapper[4728]: I0125 05:57:29.454744 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" event={"ID":"a14edcc9-69c2-4daf-b586-10ef2f436669","Type":"ContainerStarted","Data":"ebe1c3ab21283393cfe30c422ba8100325f58e903d2dbd176b146eb618abfd06"} Jan 25 05:57:29 crc kubenswrapper[4728]: I0125 05:57:29.455277 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:29 crc kubenswrapper[4728]: I0125 05:57:29.458157 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"718dab40-f0af-4030-8a9c-2a3a10aa4737","Type":"ContainerStarted","Data":"90a614578954a671b7195c572a68aa84bf9769e0fc83e1e55bc9c53d9796ca79"} Jan 25 05:57:29 crc kubenswrapper[4728]: I0125 05:57:29.477570 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-664b757dcf-jt56d" podStartSLOduration=2.477558794 podStartE2EDuration="2.477558794s" podCreationTimestamp="2026-01-25 05:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:57:29.471919381 +0000 UTC m=+1140.507797362" watchObservedRunningTime="2026-01-25 05:57:29.477558794 +0000 UTC m=+1140.513436774" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.419489 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.472807 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd57d5db9-z9qlh"] Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.473315 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerName="dnsmasq-dns" containerID="cri-o://e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c" gracePeriod=10 Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.636567 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74cf5bfd7f-hhq75"] Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.637938 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.644928 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cf5bfd7f-hhq75"] Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.826693 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-ovsdbserver-sb\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.826733 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.826751 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-config\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.826768 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-dns-svc\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.827126 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jlbpf\" (UniqueName: \"kubernetes.io/projected/2e967299-2864-48a8-ba27-7d2a63f66c43-kube-api-access-jlbpf\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.827827 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-dns-swift-storage-0\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.827886 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-ovsdbserver-nb\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.924836 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.929972 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlbpf\" (UniqueName: \"kubernetes.io/projected/2e967299-2864-48a8-ba27-7d2a63f66c43-kube-api-access-jlbpf\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.930059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-dns-swift-storage-0\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.930092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-ovsdbserver-nb\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.930138 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-ovsdbserver-sb\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.930157 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: 
\"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.930175 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-config\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.930196 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-dns-svc\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.931042 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-dns-svc\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.931520 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-ovsdbserver-sb\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.931818 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-config\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 
05:57:37.932014 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.932194 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-dns-swift-storage-0\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.933501 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e967299-2864-48a8-ba27-7d2a63f66c43-ovsdbserver-nb\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:37 crc kubenswrapper[4728]: I0125 05:57:37.954685 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlbpf\" (UniqueName: \"kubernetes.io/projected/2e967299-2864-48a8-ba27-7d2a63f66c43-kube-api-access-jlbpf\") pod \"dnsmasq-dns-74cf5bfd7f-hhq75\" (UID: \"2e967299-2864-48a8-ba27-7d2a63f66c43\") " pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.031107 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-sb\") pod \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.031176 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-sx6w6\" (UniqueName: \"kubernetes.io/projected/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-kube-api-access-sx6w6\") pod \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.031229 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-svc\") pod \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.031253 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-swift-storage-0\") pod \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.031378 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-nb\") pod \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.031429 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-config\") pod \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\" (UID: \"4bc39392-7c35-4e6d-b06d-d0e6679bcd87\") " Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.035220 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-kube-api-access-sx6w6" (OuterVolumeSpecName: "kube-api-access-sx6w6") pod "4bc39392-7c35-4e6d-b06d-d0e6679bcd87" (UID: "4bc39392-7c35-4e6d-b06d-d0e6679bcd87"). 
InnerVolumeSpecName "kube-api-access-sx6w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.069892 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4bc39392-7c35-4e6d-b06d-d0e6679bcd87" (UID: "4bc39392-7c35-4e6d-b06d-d0e6679bcd87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.069898 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-config" (OuterVolumeSpecName: "config") pod "4bc39392-7c35-4e6d-b06d-d0e6679bcd87" (UID: "4bc39392-7c35-4e6d-b06d-d0e6679bcd87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.070571 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4bc39392-7c35-4e6d-b06d-d0e6679bcd87" (UID: "4bc39392-7c35-4e6d-b06d-d0e6679bcd87"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.075824 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4bc39392-7c35-4e6d-b06d-d0e6679bcd87" (UID: "4bc39392-7c35-4e6d-b06d-d0e6679bcd87"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.078121 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4bc39392-7c35-4e6d-b06d-d0e6679bcd87" (UID: "4bc39392-7c35-4e6d-b06d-d0e6679bcd87"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.134295 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.134355 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.134372 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.134386 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx6w6\" (UniqueName: \"kubernetes.io/projected/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-kube-api-access-sx6w6\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.134402 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.134410 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4bc39392-7c35-4e6d-b06d-d0e6679bcd87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.252075 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.564389 4728 generic.go:334] "Generic (PLEG): container finished" podID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerID="e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c" exitCode=0 Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.564839 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.564696 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" event={"ID":"4bc39392-7c35-4e6d-b06d-d0e6679bcd87","Type":"ContainerDied","Data":"e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c"} Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.565052 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd57d5db9-z9qlh" event={"ID":"4bc39392-7c35-4e6d-b06d-d0e6679bcd87","Type":"ContainerDied","Data":"d5899acfee929c19a305cdf7df82a35766dcfa8740ddac3e91f7491b6350c006"} Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.565088 4728 scope.go:117] "RemoveContainer" containerID="e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.596576 4728 scope.go:117] "RemoveContainer" containerID="6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.640000 4728 scope.go:117] "RemoveContainer" containerID="e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c" Jan 25 05:57:38 crc kubenswrapper[4728]: E0125 05:57:38.640556 4728 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c\": container with ID starting with e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c not found: ID does not exist" containerID="e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.640594 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c"} err="failed to get container status \"e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c\": rpc error: code = NotFound desc = could not find container \"e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c\": container with ID starting with e9c1f24f22e3049c6952e74c2015c6833764610cf7eee49421a9c530fe74736c not found: ID does not exist" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.640626 4728 scope.go:117] "RemoveContainer" containerID="6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef" Jan 25 05:57:38 crc kubenswrapper[4728]: E0125 05:57:38.641175 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef\": container with ID starting with 6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef not found: ID does not exist" containerID="6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.641883 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef"} err="failed to get container status \"6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef\": rpc error: code = NotFound desc = could 
not find container \"6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef\": container with ID starting with 6abfbb7e9aace4e0486b8cc3575036fa4de47e3fb578675955c1594ce3c127ef not found: ID does not exist" Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.674411 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cf5bfd7f-hhq75"] Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.684415 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd57d5db9-z9qlh"] Jan 25 05:57:38 crc kubenswrapper[4728]: I0125 05:57:38.690765 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd57d5db9-z9qlh"] Jan 25 05:57:39 crc kubenswrapper[4728]: I0125 05:57:39.337828 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" path="/var/lib/kubelet/pods/4bc39392-7c35-4e6d-b06d-d0e6679bcd87/volumes" Jan 25 05:57:39 crc kubenswrapper[4728]: I0125 05:57:39.576354 4728 generic.go:334] "Generic (PLEG): container finished" podID="2e967299-2864-48a8-ba27-7d2a63f66c43" containerID="c249f80b42347b3c5dd8eea5f770030266937e1ab5dc2bd569eb8550ce9ee696" exitCode=0 Jan 25 05:57:39 crc kubenswrapper[4728]: I0125 05:57:39.576405 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" event={"ID":"2e967299-2864-48a8-ba27-7d2a63f66c43","Type":"ContainerDied","Data":"c249f80b42347b3c5dd8eea5f770030266937e1ab5dc2bd569eb8550ce9ee696"} Jan 25 05:57:39 crc kubenswrapper[4728]: I0125 05:57:39.576437 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" event={"ID":"2e967299-2864-48a8-ba27-7d2a63f66c43","Type":"ContainerStarted","Data":"34cb017477f8c9033d7c7d49d324ce624f0bce6b903c2b4a125c59c9fd86d3ff"} Jan 25 05:57:40 crc kubenswrapper[4728]: I0125 05:57:40.589684 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" event={"ID":"2e967299-2864-48a8-ba27-7d2a63f66c43","Type":"ContainerStarted","Data":"bd5813085c30efb47cb8db421e98b172aab4c90c92ad76b701b576ec7dcdde1f"} Jan 25 05:57:40 crc kubenswrapper[4728]: I0125 05:57:40.590180 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:40 crc kubenswrapper[4728]: I0125 05:57:40.609703 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" podStartSLOduration=3.6096881 podStartE2EDuration="3.6096881s" podCreationTimestamp="2026-01-25 05:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:57:40.606334005 +0000 UTC m=+1151.642211985" watchObservedRunningTime="2026-01-25 05:57:40.6096881 +0000 UTC m=+1151.645566080" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.253487 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74cf5bfd7f-hhq75" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.300916 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664b757dcf-jt56d"] Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.301200 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerName="dnsmasq-dns" containerID="cri-o://ebe1c3ab21283393cfe30c422ba8100325f58e903d2dbd176b146eb618abfd06" gracePeriod=10 Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.662729 4728 generic.go:334] "Generic (PLEG): container finished" podID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerID="ebe1c3ab21283393cfe30c422ba8100325f58e903d2dbd176b146eb618abfd06" exitCode=0 Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.662804 4728 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" event={"ID":"a14edcc9-69c2-4daf-b586-10ef2f436669","Type":"ContainerDied","Data":"ebe1c3ab21283393cfe30c422ba8100325f58e903d2dbd176b146eb618abfd06"} Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.737834 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.853029 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-svc\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.853177 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wndr\" (UniqueName: \"kubernetes.io/projected/a14edcc9-69c2-4daf-b586-10ef2f436669-kube-api-access-8wndr\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.853228 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-openstack-edpm-ipam\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.853359 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-sb\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.853384 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-swift-storage-0\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.854011 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-config\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.854087 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-nb\") pod \"a14edcc9-69c2-4daf-b586-10ef2f436669\" (UID: \"a14edcc9-69c2-4daf-b586-10ef2f436669\") " Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.858573 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14edcc9-69c2-4daf-b586-10ef2f436669-kube-api-access-8wndr" (OuterVolumeSpecName: "kube-api-access-8wndr") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "kube-api-access-8wndr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.899038 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.899835 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.902100 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.905363 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-config" (OuterVolumeSpecName: "config") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.907408 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.912165 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a14edcc9-69c2-4daf-b586-10ef2f436669" (UID: "a14edcc9-69c2-4daf-b586-10ef2f436669"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956907 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wndr\" (UniqueName: \"kubernetes.io/projected/a14edcc9-69c2-4daf-b586-10ef2f436669-kube-api-access-8wndr\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956933 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956945 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956955 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956966 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-config\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956974 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:48 crc kubenswrapper[4728]: I0125 05:57:48.956982 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a14edcc9-69c2-4daf-b586-10ef2f436669-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 25 05:57:49 crc kubenswrapper[4728]: I0125 05:57:49.674393 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" event={"ID":"a14edcc9-69c2-4daf-b586-10ef2f436669","Type":"ContainerDied","Data":"f18961914ed4564d71c4a0e35376c08b050b478fa1017b25b07915fa197eaf59"} Jan 25 05:57:49 crc kubenswrapper[4728]: I0125 05:57:49.674472 4728 scope.go:117] "RemoveContainer" containerID="ebe1c3ab21283393cfe30c422ba8100325f58e903d2dbd176b146eb618abfd06" Jan 25 05:57:49 crc kubenswrapper[4728]: I0125 05:57:49.675480 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664b757dcf-jt56d" Jan 25 05:57:49 crc kubenswrapper[4728]: I0125 05:57:49.697064 4728 scope.go:117] "RemoveContainer" containerID="486474eeeca7650fd80a72513aed9e3721c0e6ad5849c269ca1a585e43401e2a" Jan 25 05:57:49 crc kubenswrapper[4728]: I0125 05:57:49.698465 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664b757dcf-jt56d"] Jan 25 05:57:49 crc kubenswrapper[4728]: I0125 05:57:49.705997 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-664b757dcf-jt56d"] Jan 25 05:57:51 crc kubenswrapper[4728]: I0125 05:57:51.342487 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" path="/var/lib/kubelet/pods/a14edcc9-69c2-4daf-b586-10ef2f436669/volumes" Jan 25 05:58:00 crc kubenswrapper[4728]: I0125 05:58:00.784864 4728 generic.go:334] "Generic (PLEG): container finished" podID="4a9b861c-f271-4b2b-865e-925bf405c7d1" 
containerID="2ff585cf4c4a5114a49cc29737fc13fd5eec409fbbe9b398c54f846eeb5f2d06" exitCode=0 Jan 25 05:58:00 crc kubenswrapper[4728]: I0125 05:58:00.784981 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4a9b861c-f271-4b2b-865e-925bf405c7d1","Type":"ContainerDied","Data":"2ff585cf4c4a5114a49cc29737fc13fd5eec409fbbe9b398c54f846eeb5f2d06"} Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.279343 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5"] Jan 25 05:58:01 crc kubenswrapper[4728]: E0125 05:58:01.280038 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerName="init" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.280068 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerName="init" Jan 25 05:58:01 crc kubenswrapper[4728]: E0125 05:58:01.280093 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerName="dnsmasq-dns" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.280099 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerName="dnsmasq-dns" Jan 25 05:58:01 crc kubenswrapper[4728]: E0125 05:58:01.280115 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerName="init" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.280121 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerName="init" Jan 25 05:58:01 crc kubenswrapper[4728]: E0125 05:58:01.280130 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerName="dnsmasq-dns" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.280136 4728 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerName="dnsmasq-dns" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.280359 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc39392-7c35-4e6d-b06d-d0e6679bcd87" containerName="dnsmasq-dns" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.280381 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14edcc9-69c2-4daf-b586-10ef2f436669" containerName="dnsmasq-dns" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.281087 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.282596 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.282773 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.282987 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.283510 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.289231 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5"] Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.292705 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: 
\"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.292843 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgn7v\" (UniqueName: \"kubernetes.io/projected/6c8ec845-8142-4a8d-95de-59cd6d159155-kube-api-access-dgn7v\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.293081 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.293192 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.394000 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.394073 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.394158 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgn7v\" (UniqueName: \"kubernetes.io/projected/6c8ec845-8142-4a8d-95de-59cd6d159155-kube-api-access-dgn7v\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.394223 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.399117 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.399130 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.400314 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.409770 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgn7v\" (UniqueName: \"kubernetes.io/projected/6c8ec845-8142-4a8d-95de-59cd6d159155-kube-api-access-dgn7v\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.594248 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.802561 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4a9b861c-f271-4b2b-865e-925bf405c7d1","Type":"ContainerStarted","Data":"a73928dd5b670bc7f8adbb6d74f495284b696af2c490c2740a73e370ec8823bd"} Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.803588 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.806669 4728 generic.go:334] "Generic (PLEG): container finished" podID="718dab40-f0af-4030-8a9c-2a3a10aa4737" containerID="90a614578954a671b7195c572a68aa84bf9769e0fc83e1e55bc9c53d9796ca79" exitCode=0 Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.806716 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"718dab40-f0af-4030-8a9c-2a3a10aa4737","Type":"ContainerDied","Data":"90a614578954a671b7195c572a68aa84bf9769e0fc83e1e55bc9c53d9796ca79"} Jan 25 05:58:01 crc kubenswrapper[4728]: I0125 05:58:01.834412 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.83431204 podStartE2EDuration="35.83431204s" podCreationTimestamp="2026-01-25 05:57:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:58:01.828387108 +0000 UTC m=+1172.864265089" watchObservedRunningTime="2026-01-25 05:58:01.83431204 +0000 UTC m=+1172.870190020" Jan 25 05:58:02 crc kubenswrapper[4728]: I0125 05:58:02.086313 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5"] Jan 25 05:58:02 crc kubenswrapper[4728]: W0125 05:58:02.091685 4728 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c8ec845_8142_4a8d_95de_59cd6d159155.slice/crio-851980a8a3d9248700881fca562226f88836ea2ab418dc4be7b532560d1bb8d6 WatchSource:0}: Error finding container 851980a8a3d9248700881fca562226f88836ea2ab418dc4be7b532560d1bb8d6: Status 404 returned error can't find the container with id 851980a8a3d9248700881fca562226f88836ea2ab418dc4be7b532560d1bb8d6 Jan 25 05:58:02 crc kubenswrapper[4728]: I0125 05:58:02.094154 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 05:58:02 crc kubenswrapper[4728]: I0125 05:58:02.815855 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" event={"ID":"6c8ec845-8142-4a8d-95de-59cd6d159155","Type":"ContainerStarted","Data":"851980a8a3d9248700881fca562226f88836ea2ab418dc4be7b532560d1bb8d6"} Jan 25 05:58:02 crc kubenswrapper[4728]: I0125 05:58:02.817848 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"718dab40-f0af-4030-8a9c-2a3a10aa4737","Type":"ContainerStarted","Data":"2432679becfd2be62ad2f5a15bb00347d720d93b870dedd9862a7e17d141f420"} Jan 25 05:58:02 crc kubenswrapper[4728]: I0125 05:58:02.818226 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:58:02 crc kubenswrapper[4728]: I0125 05:58:02.844137 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.844118217 podStartE2EDuration="36.844118217s" podCreationTimestamp="2026-01-25 05:57:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 05:58:02.836265882 +0000 UTC m=+1173.872143863" watchObservedRunningTime="2026-01-25 05:58:02.844118217 +0000 UTC m=+1173.879996198" Jan 25 05:58:12 crc 
kubenswrapper[4728]: I0125 05:58:12.912883 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" event={"ID":"6c8ec845-8142-4a8d-95de-59cd6d159155","Type":"ContainerStarted","Data":"4131b8aa5df287f99dbe3a04753938bee7bf7e7e92e3829879740e49e9144275"} Jan 25 05:58:12 crc kubenswrapper[4728]: I0125 05:58:12.929200 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" podStartSLOduration=1.947566031 podStartE2EDuration="11.929187142s" podCreationTimestamp="2026-01-25 05:58:01 +0000 UTC" firstStartedPulling="2026-01-25 05:58:02.093964769 +0000 UTC m=+1173.129842749" lastFinishedPulling="2026-01-25 05:58:12.075585881 +0000 UTC m=+1183.111463860" observedRunningTime="2026-01-25 05:58:12.925077795 +0000 UTC m=+1183.960955776" watchObservedRunningTime="2026-01-25 05:58:12.929187142 +0000 UTC m=+1183.965065123" Jan 25 05:58:16 crc kubenswrapper[4728]: I0125 05:58:16.831211 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 25 05:58:17 crc kubenswrapper[4728]: I0125 05:58:17.204546 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 25 05:58:23 crc kubenswrapper[4728]: I0125 05:58:23.994597 4728 generic.go:334] "Generic (PLEG): container finished" podID="6c8ec845-8142-4a8d-95de-59cd6d159155" containerID="4131b8aa5df287f99dbe3a04753938bee7bf7e7e92e3829879740e49e9144275" exitCode=0 Jan 25 05:58:23 crc kubenswrapper[4728]: I0125 05:58:23.994683 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" event={"ID":"6c8ec845-8142-4a8d-95de-59cd6d159155","Type":"ContainerDied","Data":"4131b8aa5df287f99dbe3a04753938bee7bf7e7e92e3829879740e49e9144275"} Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.353149 4728 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.393154 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-inventory\") pod \"6c8ec845-8142-4a8d-95de-59cd6d159155\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.393366 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-ssh-key-openstack-edpm-ipam\") pod \"6c8ec845-8142-4a8d-95de-59cd6d159155\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.393503 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgn7v\" (UniqueName: \"kubernetes.io/projected/6c8ec845-8142-4a8d-95de-59cd6d159155-kube-api-access-dgn7v\") pod \"6c8ec845-8142-4a8d-95de-59cd6d159155\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.393654 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-repo-setup-combined-ca-bundle\") pod \"6c8ec845-8142-4a8d-95de-59cd6d159155\" (UID: \"6c8ec845-8142-4a8d-95de-59cd6d159155\") " Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.398872 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6c8ec845-8142-4a8d-95de-59cd6d159155" (UID: "6c8ec845-8142-4a8d-95de-59cd6d159155"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.399002 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8ec845-8142-4a8d-95de-59cd6d159155-kube-api-access-dgn7v" (OuterVolumeSpecName: "kube-api-access-dgn7v") pod "6c8ec845-8142-4a8d-95de-59cd6d159155" (UID: "6c8ec845-8142-4a8d-95de-59cd6d159155"). InnerVolumeSpecName "kube-api-access-dgn7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.417147 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c8ec845-8142-4a8d-95de-59cd6d159155" (UID: "6c8ec845-8142-4a8d-95de-59cd6d159155"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.419259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-inventory" (OuterVolumeSpecName: "inventory") pod "6c8ec845-8142-4a8d-95de-59cd6d159155" (UID: "6c8ec845-8142-4a8d-95de-59cd6d159155"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.496058 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.496088 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgn7v\" (UniqueName: \"kubernetes.io/projected/6c8ec845-8142-4a8d-95de-59cd6d159155-kube-api-access-dgn7v\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.496097 4728 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:25 crc kubenswrapper[4728]: I0125 05:58:25.496108 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c8ec845-8142-4a8d-95de-59cd6d159155-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.012900 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" event={"ID":"6c8ec845-8142-4a8d-95de-59cd6d159155","Type":"ContainerDied","Data":"851980a8a3d9248700881fca562226f88836ea2ab418dc4be7b532560d1bb8d6"} Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.013209 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="851980a8a3d9248700881fca562226f88836ea2ab418dc4be7b532560d1bb8d6" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.012945 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.063755 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj"] Jan 25 05:58:26 crc kubenswrapper[4728]: E0125 05:58:26.064196 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8ec845-8142-4a8d-95de-59cd6d159155" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.064213 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8ec845-8142-4a8d-95de-59cd6d159155" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.064414 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8ec845-8142-4a8d-95de-59cd6d159155" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.065059 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.066238 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.066368 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.067549 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.068866 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.072737 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj"] Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.207998 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tknxm\" (UniqueName: \"kubernetes.io/projected/bee454bd-9662-4de3-ad06-204eaa3d2709-kube-api-access-tknxm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.208044 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.208136 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.310092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.310255 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tknxm\" (UniqueName: \"kubernetes.io/projected/bee454bd-9662-4de3-ad06-204eaa3d2709-kube-api-access-tknxm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.310281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.315519 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: 
\"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.315768 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.329269 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tknxm\" (UniqueName: \"kubernetes.io/projected/bee454bd-9662-4de3-ad06-204eaa3d2709-kube-api-access-tknxm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdxpj\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.378439 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:26 crc kubenswrapper[4728]: I0125 05:58:26.836494 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj"] Jan 25 05:58:27 crc kubenswrapper[4728]: I0125 05:58:27.023371 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" event={"ID":"bee454bd-9662-4de3-ad06-204eaa3d2709","Type":"ContainerStarted","Data":"72e1e9418ed0f27614f7a70253c86cde297a5a1115e2ab890ed2191eec1f1df5"} Jan 25 05:58:28 crc kubenswrapper[4728]: I0125 05:58:28.034117 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" event={"ID":"bee454bd-9662-4de3-ad06-204eaa3d2709","Type":"ContainerStarted","Data":"5338e19e5e739d451453f0015029475c0ab0d5839da5bf40a54dbbed0b9bc511"} Jan 25 05:58:28 crc kubenswrapper[4728]: I0125 05:58:28.057422 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" podStartSLOduration=1.5142753469999999 podStartE2EDuration="2.057405777s" podCreationTimestamp="2026-01-25 05:58:26 +0000 UTC" firstStartedPulling="2026-01-25 05:58:26.838246036 +0000 UTC m=+1197.874124016" lastFinishedPulling="2026-01-25 05:58:27.381376465 +0000 UTC m=+1198.417254446" observedRunningTime="2026-01-25 05:58:28.053256864 +0000 UTC m=+1199.089134844" watchObservedRunningTime="2026-01-25 05:58:28.057405777 +0000 UTC m=+1199.093283756" Jan 25 05:58:30 crc kubenswrapper[4728]: I0125 05:58:30.050020 4728 generic.go:334] "Generic (PLEG): container finished" podID="bee454bd-9662-4de3-ad06-204eaa3d2709" containerID="5338e19e5e739d451453f0015029475c0ab0d5839da5bf40a54dbbed0b9bc511" exitCode=0 Jan 25 05:58:30 crc kubenswrapper[4728]: I0125 05:58:30.050124 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" event={"ID":"bee454bd-9662-4de3-ad06-204eaa3d2709","Type":"ContainerDied","Data":"5338e19e5e739d451453f0015029475c0ab0d5839da5bf40a54dbbed0b9bc511"} Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.407863 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.520835 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-inventory\") pod \"bee454bd-9662-4de3-ad06-204eaa3d2709\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.520917 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknxm\" (UniqueName: \"kubernetes.io/projected/bee454bd-9662-4de3-ad06-204eaa3d2709-kube-api-access-tknxm\") pod \"bee454bd-9662-4de3-ad06-204eaa3d2709\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.521119 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-ssh-key-openstack-edpm-ipam\") pod \"bee454bd-9662-4de3-ad06-204eaa3d2709\" (UID: \"bee454bd-9662-4de3-ad06-204eaa3d2709\") " Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.527014 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee454bd-9662-4de3-ad06-204eaa3d2709-kube-api-access-tknxm" (OuterVolumeSpecName: "kube-api-access-tknxm") pod "bee454bd-9662-4de3-ad06-204eaa3d2709" (UID: "bee454bd-9662-4de3-ad06-204eaa3d2709"). InnerVolumeSpecName "kube-api-access-tknxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.545827 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bee454bd-9662-4de3-ad06-204eaa3d2709" (UID: "bee454bd-9662-4de3-ad06-204eaa3d2709"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.547198 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-inventory" (OuterVolumeSpecName: "inventory") pod "bee454bd-9662-4de3-ad06-204eaa3d2709" (UID: "bee454bd-9662-4de3-ad06-204eaa3d2709"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.624612 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.624650 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tknxm\" (UniqueName: \"kubernetes.io/projected/bee454bd-9662-4de3-ad06-204eaa3d2709-kube-api-access-tknxm\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:31 crc kubenswrapper[4728]: I0125 05:58:31.624668 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bee454bd-9662-4de3-ad06-204eaa3d2709-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.082655 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" 
event={"ID":"bee454bd-9662-4de3-ad06-204eaa3d2709","Type":"ContainerDied","Data":"72e1e9418ed0f27614f7a70253c86cde297a5a1115e2ab890ed2191eec1f1df5"} Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.082702 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72e1e9418ed0f27614f7a70253c86cde297a5a1115e2ab890ed2191eec1f1df5" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.082733 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdxpj" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.142580 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd"] Jan 25 05:58:32 crc kubenswrapper[4728]: E0125 05:58:32.143599 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee454bd-9662-4de3-ad06-204eaa3d2709" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.143630 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee454bd-9662-4de3-ad06-204eaa3d2709" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.144159 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee454bd-9662-4de3-ad06-204eaa3d2709" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.145300 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.150284 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.151809 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.156630 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.156700 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.163542 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd"] Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.341058 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.341121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9pjr\" (UniqueName: \"kubernetes.io/projected/18badfbd-fe91-4d6e-8ecd-765ed6994030-kube-api-access-c9pjr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.341301 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.341434 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.444088 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.444460 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9pjr\" (UniqueName: \"kubernetes.io/projected/18badfbd-fe91-4d6e-8ecd-765ed6994030-kube-api-access-c9pjr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.444515 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.444579 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.449792 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.450009 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.451343 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.459671 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9pjr\" (UniqueName: \"kubernetes.io/projected/18badfbd-fe91-4d6e-8ecd-765ed6994030-kube-api-access-c9pjr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.466500 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 05:58:32 crc kubenswrapper[4728]: W0125 05:58:32.949095 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18badfbd_fe91_4d6e_8ecd_765ed6994030.slice/crio-f1db8ca58525b495213d3005c3b52a3523ee618a4640379bbc679503f171257c WatchSource:0}: Error finding container f1db8ca58525b495213d3005c3b52a3523ee618a4640379bbc679503f171257c: Status 404 returned error can't find the container with id f1db8ca58525b495213d3005c3b52a3523ee618a4640379bbc679503f171257c Jan 25 05:58:32 crc kubenswrapper[4728]: I0125 05:58:32.950189 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd"] Jan 25 05:58:33 crc kubenswrapper[4728]: I0125 05:58:33.096589 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" event={"ID":"18badfbd-fe91-4d6e-8ecd-765ed6994030","Type":"ContainerStarted","Data":"f1db8ca58525b495213d3005c3b52a3523ee618a4640379bbc679503f171257c"} Jan 25 05:58:34 crc kubenswrapper[4728]: I0125 05:58:34.113087 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" 
event={"ID":"18badfbd-fe91-4d6e-8ecd-765ed6994030","Type":"ContainerStarted","Data":"093977345b894e08e2155f33b995b6ae6d738d51573c3ac87b9accae6b51b31c"} Jan 25 05:58:34 crc kubenswrapper[4728]: I0125 05:58:34.133135 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" podStartSLOduration=1.642842222 podStartE2EDuration="2.133108559s" podCreationTimestamp="2026-01-25 05:58:32 +0000 UTC" firstStartedPulling="2026-01-25 05:58:32.9517017 +0000 UTC m=+1203.987579680" lastFinishedPulling="2026-01-25 05:58:33.441968038 +0000 UTC m=+1204.477846017" observedRunningTime="2026-01-25 05:58:34.124939067 +0000 UTC m=+1205.160817047" watchObservedRunningTime="2026-01-25 05:58:34.133108559 +0000 UTC m=+1205.168986540" Jan 25 05:58:42 crc kubenswrapper[4728]: I0125 05:58:42.899862 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:58:42 crc kubenswrapper[4728]: I0125 05:58:42.900537 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:59:12 crc kubenswrapper[4728]: I0125 05:59:12.899737 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:59:12 crc kubenswrapper[4728]: I0125 05:59:12.900448 4728 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:59:31 crc kubenswrapper[4728]: I0125 05:59:31.630275 4728 scope.go:117] "RemoveContainer" containerID="3328c6210345602068e557b5ee3b4c617763769aee701f8b67a4a22a054943a4" Jan 25 05:59:42 crc kubenswrapper[4728]: I0125 05:59:42.899602 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 05:59:42 crc kubenswrapper[4728]: I0125 05:59:42.900265 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 05:59:42 crc kubenswrapper[4728]: I0125 05:59:42.900342 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 05:59:42 crc kubenswrapper[4728]: I0125 05:59:42.901274 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd316250bf57712586994889b62bcccbaedbf4eba29b23e84c2d634ac0c7e82a"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 05:59:42 crc kubenswrapper[4728]: I0125 05:59:42.901342 4728 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://fd316250bf57712586994889b62bcccbaedbf4eba29b23e84c2d634ac0c7e82a" gracePeriod=600 Jan 25 05:59:43 crc kubenswrapper[4728]: I0125 05:59:43.717057 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="fd316250bf57712586994889b62bcccbaedbf4eba29b23e84c2d634ac0c7e82a" exitCode=0 Jan 25 05:59:43 crc kubenswrapper[4728]: I0125 05:59:43.717159 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"fd316250bf57712586994889b62bcccbaedbf4eba29b23e84c2d634ac0c7e82a"} Jan 25 05:59:43 crc kubenswrapper[4728]: I0125 05:59:43.717721 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"0a4f5085ca82c5966b307e41854c331b5cdb7a2b51d47df89db30b0d0eaf56ac"} Jan 25 05:59:43 crc kubenswrapper[4728]: I0125 05:59:43.717752 4728 scope.go:117] "RemoveContainer" containerID="cfcdf54d823ad6beb0133f29e917610e444d0fa6cfe06f430b6751fe7dbea675" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.150121 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt"] Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.152446 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.154936 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.157149 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.159922 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt"] Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.179532 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65ww\" (UniqueName: \"kubernetes.io/projected/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-kube-api-access-h65ww\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.179884 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-secret-volume\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.180078 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-config-volume\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.282375 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65ww\" (UniqueName: \"kubernetes.io/projected/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-kube-api-access-h65ww\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.282655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-secret-volume\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.282821 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-config-volume\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.283935 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-config-volume\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.287397 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-secret-volume\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.297671 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65ww\" (UniqueName: \"kubernetes.io/projected/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-kube-api-access-h65ww\") pod \"collect-profiles-29488680-9zlxt\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.470344 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:00 crc kubenswrapper[4728]: I0125 06:00:00.873773 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt"] Jan 25 06:00:00 crc kubenswrapper[4728]: W0125 06:00:00.877904 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45dfb47_684c_467a_ab58_23d5bcd3bc7c.slice/crio-5702ad5127401a6380981029f57561ce6ce018c9338ae41335729b4685addda5 WatchSource:0}: Error finding container 5702ad5127401a6380981029f57561ce6ce018c9338ae41335729b4685addda5: Status 404 returned error can't find the container with id 5702ad5127401a6380981029f57561ce6ce018c9338ae41335729b4685addda5 Jan 25 06:00:01 crc kubenswrapper[4728]: I0125 06:00:01.876209 4728 generic.go:334] "Generic (PLEG): container finished" podID="e45dfb47-684c-467a-ab58-23d5bcd3bc7c" containerID="ff455dda0b1ac8a73b7c07b0b5161145066912a1be24f574b84fee5fbf9d1bd8" exitCode=0 Jan 25 06:00:01 crc kubenswrapper[4728]: I0125 06:00:01.876271 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" event={"ID":"e45dfb47-684c-467a-ab58-23d5bcd3bc7c","Type":"ContainerDied","Data":"ff455dda0b1ac8a73b7c07b0b5161145066912a1be24f574b84fee5fbf9d1bd8"} Jan 25 06:00:01 crc kubenswrapper[4728]: I0125 06:00:01.876521 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" event={"ID":"e45dfb47-684c-467a-ab58-23d5bcd3bc7c","Type":"ContainerStarted","Data":"5702ad5127401a6380981029f57561ce6ce018c9338ae41335729b4685addda5"} Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.151701 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.248244 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-config-volume\") pod \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.248752 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-config-volume" (OuterVolumeSpecName: "config-volume") pod "e45dfb47-684c-467a-ab58-23d5bcd3bc7c" (UID: "e45dfb47-684c-467a-ab58-23d5bcd3bc7c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.248971 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.349825 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-secret-volume\") pod \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.349882 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h65ww\" (UniqueName: \"kubernetes.io/projected/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-kube-api-access-h65ww\") pod \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\" (UID: \"e45dfb47-684c-467a-ab58-23d5bcd3bc7c\") " Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.357199 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e45dfb47-684c-467a-ab58-23d5bcd3bc7c" (UID: "e45dfb47-684c-467a-ab58-23d5bcd3bc7c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.357412 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-kube-api-access-h65ww" (OuterVolumeSpecName: "kube-api-access-h65ww") pod "e45dfb47-684c-467a-ab58-23d5bcd3bc7c" (UID: "e45dfb47-684c-467a-ab58-23d5bcd3bc7c"). InnerVolumeSpecName "kube-api-access-h65ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.453284 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.453336 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h65ww\" (UniqueName: \"kubernetes.io/projected/e45dfb47-684c-467a-ab58-23d5bcd3bc7c-kube-api-access-h65ww\") on node \"crc\" DevicePath \"\"" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.899512 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" event={"ID":"e45dfb47-684c-467a-ab58-23d5bcd3bc7c","Type":"ContainerDied","Data":"5702ad5127401a6380981029f57561ce6ce018c9338ae41335729b4685addda5"} Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.899845 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5702ad5127401a6380981029f57561ce6ce018c9338ae41335729b4685addda5" Jan 25 06:00:03 crc kubenswrapper[4728]: I0125 06:00:03.899592 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt" Jan 25 06:00:31 crc kubenswrapper[4728]: I0125 06:00:31.684843 4728 scope.go:117] "RemoveContainer" containerID="c495a7f15f59212d489ad77b5670c644939fc7c28a041aa53fa8eebc4dee7adf" Jan 25 06:00:31 crc kubenswrapper[4728]: I0125 06:00:31.716498 4728 scope.go:117] "RemoveContainer" containerID="3f2a317de61da75b817d6e155322b6bc5f4b316dcebdaeaa17a12b0dcb7c813e" Jan 25 06:00:31 crc kubenswrapper[4728]: I0125 06:00:31.748435 4728 scope.go:117] "RemoveContainer" containerID="0b02f2609214d5e0a973fa589ad8eed9a37401113ec309b20c498e1baab1ead5" Jan 25 06:00:31 crc kubenswrapper[4728]: I0125 06:00:31.769827 4728 scope.go:117] "RemoveContainer" containerID="839b651ed2bf0e19db19a6d10dc9b1a3c5f81bec38ce35088fc0504b7c9b36a7" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.142923 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29488681-j86cs"] Jan 25 06:01:00 crc kubenswrapper[4728]: E0125 06:01:00.143978 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45dfb47-684c-467a-ab58-23d5bcd3bc7c" containerName="collect-profiles" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.143995 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45dfb47-684c-467a-ab58-23d5bcd3bc7c" containerName="collect-profiles" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.144193 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45dfb47-684c-467a-ab58-23d5bcd3bc7c" containerName="collect-profiles" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.144960 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.148632 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29488681-j86cs"] Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.245419 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-config-data\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.245525 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-fernet-keys\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.245709 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pthr\" (UniqueName: \"kubernetes.io/projected/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-kube-api-access-2pthr\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.245965 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-combined-ca-bundle\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.348941 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2pthr\" (UniqueName: \"kubernetes.io/projected/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-kube-api-access-2pthr\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.349106 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-combined-ca-bundle\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.349155 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-config-data\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.349188 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-fernet-keys\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.356620 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-combined-ca-bundle\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.356991 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-fernet-keys\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.357051 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-config-data\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.364607 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pthr\" (UniqueName: \"kubernetes.io/projected/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-kube-api-access-2pthr\") pod \"keystone-cron-29488681-j86cs\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.460410 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:00 crc kubenswrapper[4728]: I0125 06:01:00.876375 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29488681-j86cs"] Jan 25 06:01:01 crc kubenswrapper[4728]: I0125 06:01:01.425724 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29488681-j86cs" event={"ID":"6197fee2-5bbd-4edd-bcb5-c10f476f4f83","Type":"ContainerStarted","Data":"489a5f75f147319cd8be3f537614fe50068233d48d50df96b11c52acad305525"} Jan 25 06:01:01 crc kubenswrapper[4728]: I0125 06:01:01.425992 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29488681-j86cs" event={"ID":"6197fee2-5bbd-4edd-bcb5-c10f476f4f83","Type":"ContainerStarted","Data":"5ecf52d29d8070c641f59f1d167fc2afd1feaf15b4a2e0a98926b31c3a7d7a15"} Jan 25 06:01:01 crc kubenswrapper[4728]: I0125 06:01:01.459681 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29488681-j86cs" podStartSLOduration=1.45966679 podStartE2EDuration="1.45966679s" podCreationTimestamp="2026-01-25 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 06:01:01.455727934 +0000 UTC m=+1352.491605914" watchObservedRunningTime="2026-01-25 06:01:01.45966679 +0000 UTC m=+1352.495544770" Jan 25 06:01:03 crc kubenswrapper[4728]: I0125 06:01:03.454597 4728 generic.go:334] "Generic (PLEG): container finished" podID="6197fee2-5bbd-4edd-bcb5-c10f476f4f83" containerID="489a5f75f147319cd8be3f537614fe50068233d48d50df96b11c52acad305525" exitCode=0 Jan 25 06:01:03 crc kubenswrapper[4728]: I0125 06:01:03.454675 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29488681-j86cs" event={"ID":"6197fee2-5bbd-4edd-bcb5-c10f476f4f83","Type":"ContainerDied","Data":"489a5f75f147319cd8be3f537614fe50068233d48d50df96b11c52acad305525"} 
Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.731425 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.845688 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-config-data\") pod \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.845820 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pthr\" (UniqueName: \"kubernetes.io/projected/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-kube-api-access-2pthr\") pod \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.845896 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-combined-ca-bundle\") pod \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.845973 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-fernet-keys\") pod \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\" (UID: \"6197fee2-5bbd-4edd-bcb5-c10f476f4f83\") " Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.852760 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6197fee2-5bbd-4edd-bcb5-c10f476f4f83" (UID: "6197fee2-5bbd-4edd-bcb5-c10f476f4f83"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.854351 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-kube-api-access-2pthr" (OuterVolumeSpecName: "kube-api-access-2pthr") pod "6197fee2-5bbd-4edd-bcb5-c10f476f4f83" (UID: "6197fee2-5bbd-4edd-bcb5-c10f476f4f83"). InnerVolumeSpecName "kube-api-access-2pthr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.873147 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6197fee2-5bbd-4edd-bcb5-c10f476f4f83" (UID: "6197fee2-5bbd-4edd-bcb5-c10f476f4f83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.894782 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-config-data" (OuterVolumeSpecName: "config-data") pod "6197fee2-5bbd-4edd-bcb5-c10f476f4f83" (UID: "6197fee2-5bbd-4edd-bcb5-c10f476f4f83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.949274 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.949312 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pthr\" (UniqueName: \"kubernetes.io/projected/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-kube-api-access-2pthr\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.949356 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:04 crc kubenswrapper[4728]: I0125 06:01:04.949366 4728 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6197fee2-5bbd-4edd-bcb5-c10f476f4f83-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:05 crc kubenswrapper[4728]: I0125 06:01:05.472009 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29488681-j86cs" event={"ID":"6197fee2-5bbd-4edd-bcb5-c10f476f4f83","Type":"ContainerDied","Data":"5ecf52d29d8070c641f59f1d167fc2afd1feaf15b4a2e0a98926b31c3a7d7a15"} Jan 25 06:01:05 crc kubenswrapper[4728]: I0125 06:01:05.472064 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ecf52d29d8070c641f59f1d167fc2afd1feaf15b4a2e0a98926b31c3a7d7a15" Jan 25 06:01:05 crc kubenswrapper[4728]: I0125 06:01:05.472100 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29488681-j86cs" Jan 25 06:01:22 crc kubenswrapper[4728]: I0125 06:01:22.615405 4728 generic.go:334] "Generic (PLEG): container finished" podID="18badfbd-fe91-4d6e-8ecd-765ed6994030" containerID="093977345b894e08e2155f33b995b6ae6d738d51573c3ac87b9accae6b51b31c" exitCode=0 Jan 25 06:01:22 crc kubenswrapper[4728]: I0125 06:01:22.615440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" event={"ID":"18badfbd-fe91-4d6e-8ecd-765ed6994030","Type":"ContainerDied","Data":"093977345b894e08e2155f33b995b6ae6d738d51573c3ac87b9accae6b51b31c"} Jan 25 06:01:23 crc kubenswrapper[4728]: I0125 06:01:23.937617 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.034234 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-bootstrap-combined-ca-bundle\") pod \"18badfbd-fe91-4d6e-8ecd-765ed6994030\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.034314 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-ssh-key-openstack-edpm-ipam\") pod \"18badfbd-fe91-4d6e-8ecd-765ed6994030\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.034507 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-inventory\") pod \"18badfbd-fe91-4d6e-8ecd-765ed6994030\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " Jan 25 06:01:24 crc 
kubenswrapper[4728]: I0125 06:01:24.034626 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9pjr\" (UniqueName: \"kubernetes.io/projected/18badfbd-fe91-4d6e-8ecd-765ed6994030-kube-api-access-c9pjr\") pod \"18badfbd-fe91-4d6e-8ecd-765ed6994030\" (UID: \"18badfbd-fe91-4d6e-8ecd-765ed6994030\") " Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.041556 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18badfbd-fe91-4d6e-8ecd-765ed6994030-kube-api-access-c9pjr" (OuterVolumeSpecName: "kube-api-access-c9pjr") pod "18badfbd-fe91-4d6e-8ecd-765ed6994030" (UID: "18badfbd-fe91-4d6e-8ecd-765ed6994030"). InnerVolumeSpecName "kube-api-access-c9pjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.042030 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "18badfbd-fe91-4d6e-8ecd-765ed6994030" (UID: "18badfbd-fe91-4d6e-8ecd-765ed6994030"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.060050 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-inventory" (OuterVolumeSpecName: "inventory") pod "18badfbd-fe91-4d6e-8ecd-765ed6994030" (UID: "18badfbd-fe91-4d6e-8ecd-765ed6994030"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.061250 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "18badfbd-fe91-4d6e-8ecd-765ed6994030" (UID: "18badfbd-fe91-4d6e-8ecd-765ed6994030"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.137754 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.138063 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9pjr\" (UniqueName: \"kubernetes.io/projected/18badfbd-fe91-4d6e-8ecd-765ed6994030-kube-api-access-c9pjr\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.138077 4728 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.138086 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18badfbd-fe91-4d6e-8ecd-765ed6994030-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.636666 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" event={"ID":"18badfbd-fe91-4d6e-8ecd-765ed6994030","Type":"ContainerDied","Data":"f1db8ca58525b495213d3005c3b52a3523ee618a4640379bbc679503f171257c"} Jan 25 06:01:24 
crc kubenswrapper[4728]: I0125 06:01:24.636716 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1db8ca58525b495213d3005c3b52a3523ee618a4640379bbc679503f171257c" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.636793 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.701075 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk"] Jan 25 06:01:24 crc kubenswrapper[4728]: E0125 06:01:24.701499 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6197fee2-5bbd-4edd-bcb5-c10f476f4f83" containerName="keystone-cron" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.701516 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6197fee2-5bbd-4edd-bcb5-c10f476f4f83" containerName="keystone-cron" Jan 25 06:01:24 crc kubenswrapper[4728]: E0125 06:01:24.701539 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18badfbd-fe91-4d6e-8ecd-765ed6994030" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.701548 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="18badfbd-fe91-4d6e-8ecd-765ed6994030" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.701718 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6197fee2-5bbd-4edd-bcb5-c10f476f4f83" containerName="keystone-cron" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.701744 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="18badfbd-fe91-4d6e-8ecd-765ed6994030" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.702424 4728 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.704803 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.704885 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.705184 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.705471 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.712583 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk"] Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.849589 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dmn\" (UniqueName: \"kubernetes.io/projected/abcaa620-a9bf-4edf-a044-ea75ca9fa872-kube-api-access-x2dmn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.849786 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 
06:01:24.849983 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.951693 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.951792 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dmn\" (UniqueName: \"kubernetes.io/projected/abcaa620-a9bf-4edf-a044-ea75ca9fa872-kube-api-access-x2dmn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.951856 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.956180 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.956184 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:24 crc kubenswrapper[4728]: I0125 06:01:24.966334 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dmn\" (UniqueName: \"kubernetes.io/projected/abcaa620-a9bf-4edf-a044-ea75ca9fa872-kube-api-access-x2dmn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-665xk\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:25 crc kubenswrapper[4728]: I0125 06:01:25.016832 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:01:25 crc kubenswrapper[4728]: I0125 06:01:25.487452 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk"] Jan 25 06:01:25 crc kubenswrapper[4728]: I0125 06:01:25.645620 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" event={"ID":"abcaa620-a9bf-4edf-a044-ea75ca9fa872","Type":"ContainerStarted","Data":"bddc9da1fe8a323268364ed644cd34bf825ab1a235379d6ba118d88264a913e4"} Jan 25 06:01:26 crc kubenswrapper[4728]: I0125 06:01:26.659444 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" event={"ID":"abcaa620-a9bf-4edf-a044-ea75ca9fa872","Type":"ContainerStarted","Data":"1ff6bf2a64652e928ab00f0835e797b526f0c2bab773aa1e3912d9216ee94be7"} Jan 25 06:01:26 crc kubenswrapper[4728]: I0125 06:01:26.687296 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" podStartSLOduration=2.180080354 podStartE2EDuration="2.687277441s" podCreationTimestamp="2026-01-25 06:01:24 +0000 UTC" firstStartedPulling="2026-01-25 06:01:25.489507308 +0000 UTC m=+1376.525385288" lastFinishedPulling="2026-01-25 06:01:25.996704396 +0000 UTC m=+1377.032582375" observedRunningTime="2026-01-25 06:01:26.67654781 +0000 UTC m=+1377.712425790" watchObservedRunningTime="2026-01-25 06:01:26.687277441 +0000 UTC m=+1377.723155420" Jan 25 06:02:12 crc kubenswrapper[4728]: I0125 06:02:12.899556 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:02:12 
crc kubenswrapper[4728]: I0125 06:02:12.900017 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.214025 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sb9cb"] Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.216813 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.226397 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb9cb"] Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.282334 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwxc\" (UniqueName: \"kubernetes.io/projected/265ca283-a737-488e-88a0-182eb5b88601-kube-api-access-wzwxc\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.282519 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-utilities\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.282655 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-catalog-content\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.385090 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwxc\" (UniqueName: \"kubernetes.io/projected/265ca283-a737-488e-88a0-182eb5b88601-kube-api-access-wzwxc\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.385262 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-utilities\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.385365 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-catalog-content\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.386014 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-catalog-content\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.386093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-utilities\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.403299 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwxc\" (UniqueName: \"kubernetes.io/projected/265ca283-a737-488e-88a0-182eb5b88601-kube-api-access-wzwxc\") pod \"redhat-operators-sb9cb\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.535379 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:37 crc kubenswrapper[4728]: I0125 06:02:37.929761 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb9cb"] Jan 25 06:02:37 crc kubenswrapper[4728]: W0125 06:02:37.930377 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod265ca283_a737_488e_88a0_182eb5b88601.slice/crio-1b49166ffc9a52c4b6c56c065aa31ab613668264806229c633caa557516cbd41 WatchSource:0}: Error finding container 1b49166ffc9a52c4b6c56c065aa31ab613668264806229c633caa557516cbd41: Status 404 returned error can't find the container with id 1b49166ffc9a52c4b6c56c065aa31ab613668264806229c633caa557516cbd41 Jan 25 06:02:38 crc kubenswrapper[4728]: I0125 06:02:38.271299 4728 generic.go:334] "Generic (PLEG): container finished" podID="265ca283-a737-488e-88a0-182eb5b88601" containerID="750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01" exitCode=0 Jan 25 06:02:38 crc kubenswrapper[4728]: I0125 06:02:38.271536 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" 
event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerDied","Data":"750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01"} Jan 25 06:02:38 crc kubenswrapper[4728]: I0125 06:02:38.271569 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerStarted","Data":"1b49166ffc9a52c4b6c56c065aa31ab613668264806229c633caa557516cbd41"} Jan 25 06:02:39 crc kubenswrapper[4728]: I0125 06:02:39.279875 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerStarted","Data":"0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5"} Jan 25 06:02:41 crc kubenswrapper[4728]: I0125 06:02:41.296787 4728 generic.go:334] "Generic (PLEG): container finished" podID="265ca283-a737-488e-88a0-182eb5b88601" containerID="0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5" exitCode=0 Jan 25 06:02:41 crc kubenswrapper[4728]: I0125 06:02:41.296835 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerDied","Data":"0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5"} Jan 25 06:02:42 crc kubenswrapper[4728]: I0125 06:02:42.306637 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerStarted","Data":"d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa"} Jan 25 06:02:42 crc kubenswrapper[4728]: I0125 06:02:42.327531 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sb9cb" podStartSLOduration=1.7803698049999999 podStartE2EDuration="5.327513469s" podCreationTimestamp="2026-01-25 06:02:37 +0000 UTC" 
firstStartedPulling="2026-01-25 06:02:38.273215377 +0000 UTC m=+1449.309093358" lastFinishedPulling="2026-01-25 06:02:41.820359042 +0000 UTC m=+1452.856237022" observedRunningTime="2026-01-25 06:02:42.321248647 +0000 UTC m=+1453.357126627" watchObservedRunningTime="2026-01-25 06:02:42.327513469 +0000 UTC m=+1453.363391449" Jan 25 06:02:42 crc kubenswrapper[4728]: I0125 06:02:42.899882 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:02:42 crc kubenswrapper[4728]: I0125 06:02:42.899950 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:02:47 crc kubenswrapper[4728]: I0125 06:02:47.535832 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:47 crc kubenswrapper[4728]: I0125 06:02:47.536198 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:47 crc kubenswrapper[4728]: I0125 06:02:47.570439 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:48 crc kubenswrapper[4728]: I0125 06:02:48.381307 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:48 crc kubenswrapper[4728]: I0125 06:02:48.424755 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sb9cb"] Jan 25 
06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.362766 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sb9cb" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="registry-server" containerID="cri-o://d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa" gracePeriod=2 Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.746647 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.821625 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-catalog-content\") pod \"265ca283-a737-488e-88a0-182eb5b88601\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.821767 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-utilities\") pod \"265ca283-a737-488e-88a0-182eb5b88601\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.821853 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzwxc\" (UniqueName: \"kubernetes.io/projected/265ca283-a737-488e-88a0-182eb5b88601-kube-api-access-wzwxc\") pod \"265ca283-a737-488e-88a0-182eb5b88601\" (UID: \"265ca283-a737-488e-88a0-182eb5b88601\") " Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.822527 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-utilities" (OuterVolumeSpecName: "utilities") pod "265ca283-a737-488e-88a0-182eb5b88601" (UID: "265ca283-a737-488e-88a0-182eb5b88601"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.827415 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265ca283-a737-488e-88a0-182eb5b88601-kube-api-access-wzwxc" (OuterVolumeSpecName: "kube-api-access-wzwxc") pod "265ca283-a737-488e-88a0-182eb5b88601" (UID: "265ca283-a737-488e-88a0-182eb5b88601"). InnerVolumeSpecName "kube-api-access-wzwxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.917272 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "265ca283-a737-488e-88a0-182eb5b88601" (UID: "265ca283-a737-488e-88a0-182eb5b88601"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.924720 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.924752 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzwxc\" (UniqueName: \"kubernetes.io/projected/265ca283-a737-488e-88a0-182eb5b88601-kube-api-access-wzwxc\") on node \"crc\" DevicePath \"\"" Jan 25 06:02:50 crc kubenswrapper[4728]: I0125 06:02:50.924765 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ca283-a737-488e-88a0-182eb5b88601-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.372469 4728 generic.go:334] "Generic (PLEG): container finished" podID="265ca283-a737-488e-88a0-182eb5b88601" 
containerID="d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa" exitCode=0 Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.372515 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb9cb" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.372534 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerDied","Data":"d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa"} Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.372798 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb9cb" event={"ID":"265ca283-a737-488e-88a0-182eb5b88601","Type":"ContainerDied","Data":"1b49166ffc9a52c4b6c56c065aa31ab613668264806229c633caa557516cbd41"} Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.372817 4728 scope.go:117] "RemoveContainer" containerID="d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.392632 4728 scope.go:117] "RemoveContainer" containerID="0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.394991 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sb9cb"] Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.403094 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sb9cb"] Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.413906 4728 scope.go:117] "RemoveContainer" containerID="750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.447821 4728 scope.go:117] "RemoveContainer" containerID="d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa" Jan 25 06:02:51 crc 
kubenswrapper[4728]: E0125 06:02:51.448265 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa\": container with ID starting with d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa not found: ID does not exist" containerID="d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.448303 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa"} err="failed to get container status \"d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa\": rpc error: code = NotFound desc = could not find container \"d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa\": container with ID starting with d7efb13d01c9bff57f16a619a5ef682e337a8289eb6f1815886bd02bfced30fa not found: ID does not exist" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.448360 4728 scope.go:117] "RemoveContainer" containerID="0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5" Jan 25 06:02:51 crc kubenswrapper[4728]: E0125 06:02:51.448621 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5\": container with ID starting with 0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5 not found: ID does not exist" containerID="0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.448651 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5"} err="failed to get container status 
\"0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5\": rpc error: code = NotFound desc = could not find container \"0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5\": container with ID starting with 0df99fc81f292cdbaca37989799606d62ea5c81319a7724b05e9e4df8ae2d5a5 not found: ID does not exist" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.448666 4728 scope.go:117] "RemoveContainer" containerID="750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01" Jan 25 06:02:51 crc kubenswrapper[4728]: E0125 06:02:51.448892 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01\": container with ID starting with 750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01 not found: ID does not exist" containerID="750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01" Jan 25 06:02:51 crc kubenswrapper[4728]: I0125 06:02:51.448917 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01"} err="failed to get container status \"750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01\": rpc error: code = NotFound desc = could not find container \"750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01\": container with ID starting with 750f48531a2504d931d0a4565253dda953ccfea0549f5915a01d059d91648b01 not found: ID does not exist" Jan 25 06:02:53 crc kubenswrapper[4728]: I0125 06:02:53.342558 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265ca283-a737-488e-88a0-182eb5b88601" path="/var/lib/kubelet/pods/265ca283-a737-488e-88a0-182eb5b88601/volumes" Jan 25 06:03:09 crc kubenswrapper[4728]: I0125 06:03:09.514300 4728 generic.go:334] "Generic (PLEG): container finished" podID="abcaa620-a9bf-4edf-a044-ea75ca9fa872" 
containerID="1ff6bf2a64652e928ab00f0835e797b526f0c2bab773aa1e3912d9216ee94be7" exitCode=0 Jan 25 06:03:09 crc kubenswrapper[4728]: I0125 06:03:09.514341 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" event={"ID":"abcaa620-a9bf-4edf-a044-ea75ca9fa872","Type":"ContainerDied","Data":"1ff6bf2a64652e928ab00f0835e797b526f0c2bab773aa1e3912d9216ee94be7"} Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.811838 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.826687 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dmn\" (UniqueName: \"kubernetes.io/projected/abcaa620-a9bf-4edf-a044-ea75ca9fa872-kube-api-access-x2dmn\") pod \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.826760 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-ssh-key-openstack-edpm-ipam\") pod \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.826814 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-inventory\") pod \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\" (UID: \"abcaa620-a9bf-4edf-a044-ea75ca9fa872\") " Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.831450 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abcaa620-a9bf-4edf-a044-ea75ca9fa872-kube-api-access-x2dmn" (OuterVolumeSpecName: 
"kube-api-access-x2dmn") pod "abcaa620-a9bf-4edf-a044-ea75ca9fa872" (UID: "abcaa620-a9bf-4edf-a044-ea75ca9fa872"). InnerVolumeSpecName "kube-api-access-x2dmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.848128 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-inventory" (OuterVolumeSpecName: "inventory") pod "abcaa620-a9bf-4edf-a044-ea75ca9fa872" (UID: "abcaa620-a9bf-4edf-a044-ea75ca9fa872"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.849024 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "abcaa620-a9bf-4edf-a044-ea75ca9fa872" (UID: "abcaa620-a9bf-4edf-a044-ea75ca9fa872"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.928949 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2dmn\" (UniqueName: \"kubernetes.io/projected/abcaa620-a9bf-4edf-a044-ea75ca9fa872-kube-api-access-x2dmn\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.928975 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:10 crc kubenswrapper[4728]: I0125 06:03:10.928986 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abcaa620-a9bf-4edf-a044-ea75ca9fa872-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.530966 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" event={"ID":"abcaa620-a9bf-4edf-a044-ea75ca9fa872","Type":"ContainerDied","Data":"bddc9da1fe8a323268364ed644cd34bf825ab1a235379d6ba118d88264a913e4"} Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.531006 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bddc9da1fe8a323268364ed644cd34bf825ab1a235379d6ba118d88264a913e4" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.531012 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-665xk" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.601136 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts"] Jan 25 06:03:11 crc kubenswrapper[4728]: E0125 06:03:11.601839 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcaa620-a9bf-4edf-a044-ea75ca9fa872" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.601945 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcaa620-a9bf-4edf-a044-ea75ca9fa872" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 25 06:03:11 crc kubenswrapper[4728]: E0125 06:03:11.602044 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="extract-content" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.602095 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="extract-content" Jan 25 06:03:11 crc kubenswrapper[4728]: E0125 06:03:11.602158 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="extract-utilities" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.602201 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="extract-utilities" Jan 25 06:03:11 crc kubenswrapper[4728]: E0125 06:03:11.602252 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="registry-server" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.602294 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="registry-server" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 
06:03:11.602654 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="265ca283-a737-488e-88a0-182eb5b88601" containerName="registry-server" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.602746 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="abcaa620-a9bf-4edf-a044-ea75ca9fa872" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.603638 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.605862 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.605886 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.606089 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.607912 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.609216 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts"] Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.638836 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc 
kubenswrapper[4728]: I0125 06:03:11.638889 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4jh\" (UniqueName: \"kubernetes.io/projected/022da454-0c7e-4950-9147-f13a2f725b47-kube-api-access-tl4jh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.639043 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.740980 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.741219 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4jh\" (UniqueName: \"kubernetes.io/projected/022da454-0c7e-4950-9147-f13a2f725b47-kube-api-access-tl4jh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.741368 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.744716 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.746210 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.755568 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4jh\" (UniqueName: \"kubernetes.io/projected/022da454-0c7e-4950-9147-f13a2f725b47-kube-api-access-tl4jh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:11 crc kubenswrapper[4728]: I0125 06:03:11.919936 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.360035 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts"] Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.363201 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.540605 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" event={"ID":"022da454-0c7e-4950-9147-f13a2f725b47","Type":"ContainerStarted","Data":"72afca3416a1cb4f4ed4459ab92f5f680f2abcc26b22bcdbbed818df041b2561"} Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.899713 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.899992 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.900051 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.900750 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0a4f5085ca82c5966b307e41854c331b5cdb7a2b51d47df89db30b0d0eaf56ac"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:03:12 crc kubenswrapper[4728]: I0125 06:03:12.900816 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://0a4f5085ca82c5966b307e41854c331b5cdb7a2b51d47df89db30b0d0eaf56ac" gracePeriod=600 Jan 25 06:03:13 crc kubenswrapper[4728]: I0125 06:03:13.550495 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" event={"ID":"022da454-0c7e-4950-9147-f13a2f725b47","Type":"ContainerStarted","Data":"b6431ce75eb91940c0d21ca5848da8e9c7e3bded59efb29dcdd0bd047a949aac"} Jan 25 06:03:13 crc kubenswrapper[4728]: I0125 06:03:13.554925 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="0a4f5085ca82c5966b307e41854c331b5cdb7a2b51d47df89db30b0d0eaf56ac" exitCode=0 Jan 25 06:03:13 crc kubenswrapper[4728]: I0125 06:03:13.554972 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"0a4f5085ca82c5966b307e41854c331b5cdb7a2b51d47df89db30b0d0eaf56ac"} Jan 25 06:03:13 crc kubenswrapper[4728]: I0125 06:03:13.554990 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29"} Jan 25 06:03:13 crc kubenswrapper[4728]: I0125 06:03:13.555018 
4728 scope.go:117] "RemoveContainer" containerID="fd316250bf57712586994889b62bcccbaedbf4eba29b23e84c2d634ac0c7e82a" Jan 25 06:03:13 crc kubenswrapper[4728]: I0125 06:03:13.567803 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" podStartSLOduration=2.011974825 podStartE2EDuration="2.567787095s" podCreationTimestamp="2026-01-25 06:03:11 +0000 UTC" firstStartedPulling="2026-01-25 06:03:12.362968753 +0000 UTC m=+1483.398846733" lastFinishedPulling="2026-01-25 06:03:12.918781023 +0000 UTC m=+1483.954659003" observedRunningTime="2026-01-25 06:03:13.563950141 +0000 UTC m=+1484.599828121" watchObservedRunningTime="2026-01-25 06:03:13.567787095 +0000 UTC m=+1484.603665076" Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.047129 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5c15-account-create-update-fnc86"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.057396 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8njbn"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.065737 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8e95-account-create-update-s4qm5"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.073601 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-16ae-account-create-update-mglnl"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.079970 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-q55tq"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.085944 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-98z6f"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.091893 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-16ae-account-create-update-mglnl"] Jan 25 06:03:28 crc kubenswrapper[4728]: 
I0125 06:03:28.097524 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-98z6f"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.102981 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5c15-account-create-update-fnc86"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.108021 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8njbn"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.112983 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8e95-account-create-update-s4qm5"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.117905 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-q55tq"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.809805 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jn8pn"] Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.836882 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:28 crc kubenswrapper[4728]: I0125 06:03:28.838135 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn8pn"] Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.038015 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-catalog-content\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.038132 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-utilities\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.038203 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5lc\" (UniqueName: \"kubernetes.io/projected/18ecbe67-7bfd-4e36-8166-14659f560d65-kube-api-access-8j5lc\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.139779 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-utilities\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.139886 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8j5lc\" (UniqueName: \"kubernetes.io/projected/18ecbe67-7bfd-4e36-8166-14659f560d65-kube-api-access-8j5lc\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.139907 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-catalog-content\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.140437 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-catalog-content\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.140662 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-utilities\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.169295 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5lc\" (UniqueName: \"kubernetes.io/projected/18ecbe67-7bfd-4e36-8166-14659f560d65-kube-api-access-8j5lc\") pod \"redhat-marketplace-jn8pn\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.338276 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="11c7493c-1258-4bdd-a009-525474fe9aed" path="/var/lib/kubelet/pods/11c7493c-1258-4bdd-a009-525474fe9aed/volumes" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.339469 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec75869-ff42-4ec7-b69b-da9d72fe052a" path="/var/lib/kubelet/pods/5ec75869-ff42-4ec7-b69b-da9d72fe052a/volumes" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.340353 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68bda3b6-c13a-4843-88be-c192ea6c8777" path="/var/lib/kubelet/pods/68bda3b6-c13a-4843-88be-c192ea6c8777/volumes" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.341090 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa32c669-499d-4df6-b58f-0fc9680ac7b2" path="/var/lib/kubelet/pods/aa32c669-499d-4df6-b58f-0fc9680ac7b2/volumes" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.343643 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b680d31f-ff1c-460a-8937-0f97023ba959" path="/var/lib/kubelet/pods/b680d31f-ff1c-460a-8937-0f97023ba959/volumes" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.344355 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ac425f-438a-4797-8689-86ca94810696" path="/var/lib/kubelet/pods/e8ac425f-438a-4797-8689-86ca94810696/volumes" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.467266 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:29 crc kubenswrapper[4728]: I0125 06:03:29.865293 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn8pn"] Jan 25 06:03:29 crc kubenswrapper[4728]: W0125 06:03:29.869738 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ecbe67_7bfd_4e36_8166_14659f560d65.slice/crio-be91d381818140f0c95cf598aadc6d96d6e4a93d173fdf3a5156e0842fa39833 WatchSource:0}: Error finding container be91d381818140f0c95cf598aadc6d96d6e4a93d173fdf3a5156e0842fa39833: Status 404 returned error can't find the container with id be91d381818140f0c95cf598aadc6d96d6e4a93d173fdf3a5156e0842fa39833 Jan 25 06:03:30 crc kubenswrapper[4728]: I0125 06:03:30.700776 4728 generic.go:334] "Generic (PLEG): container finished" podID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerID="5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f" exitCode=0 Jan 25 06:03:30 crc kubenswrapper[4728]: I0125 06:03:30.700834 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerDied","Data":"5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f"} Jan 25 06:03:30 crc kubenswrapper[4728]: I0125 06:03:30.701107 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerStarted","Data":"be91d381818140f0c95cf598aadc6d96d6e4a93d173fdf3a5156e0842fa39833"} Jan 25 06:03:31 crc kubenswrapper[4728]: I0125 06:03:31.710910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" 
event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerStarted","Data":"911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d"} Jan 25 06:03:31 crc kubenswrapper[4728]: I0125 06:03:31.877844 4728 scope.go:117] "RemoveContainer" containerID="add62ed99e48fda7ce68c3d4cd1477e0e56f164479f6d0f212ca761517ea8a8a" Jan 25 06:03:31 crc kubenswrapper[4728]: I0125 06:03:31.897493 4728 scope.go:117] "RemoveContainer" containerID="6cd0b205a90e0434075ee8aedb18d7ff40f87510d38eeadfee691fef8a419ddf" Jan 25 06:03:31 crc kubenswrapper[4728]: I0125 06:03:31.934740 4728 scope.go:117] "RemoveContainer" containerID="290c131b92248c1600293b5d3a084e826fa55d32e2142e1bb68b9f56e37d7cad" Jan 25 06:03:31 crc kubenswrapper[4728]: I0125 06:03:31.978542 4728 scope.go:117] "RemoveContainer" containerID="160886ff4fbc54544041622e98bd3d08f4780e8146efe633ec126f29c2e4e724" Jan 25 06:03:32 crc kubenswrapper[4728]: I0125 06:03:32.008972 4728 scope.go:117] "RemoveContainer" containerID="3aed13db85c4bbd8d8fb9403d7cff155a0a52dd9df4a1b441ae146e03519eef2" Jan 25 06:03:32 crc kubenswrapper[4728]: I0125 06:03:32.051896 4728 scope.go:117] "RemoveContainer" containerID="33affe8305d05049193d8c76af3dd0ff87bfedc43a8ef45804f1ce64afde2817" Jan 25 06:03:32 crc kubenswrapper[4728]: I0125 06:03:32.730587 4728 generic.go:334] "Generic (PLEG): container finished" podID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerID="911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d" exitCode=0 Jan 25 06:03:32 crc kubenswrapper[4728]: I0125 06:03:32.730643 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerDied","Data":"911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d"} Jan 25 06:03:33 crc kubenswrapper[4728]: I0125 06:03:33.029903 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7r9b2"] Jan 25 06:03:33 crc 
kubenswrapper[4728]: I0125 06:03:33.036573 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7r9b2"] Jan 25 06:03:33 crc kubenswrapper[4728]: I0125 06:03:33.337612 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eaf8418-c302-42cc-bd8e-a0797df4a1a1" path="/var/lib/kubelet/pods/4eaf8418-c302-42cc-bd8e-a0797df4a1a1/volumes" Jan 25 06:03:33 crc kubenswrapper[4728]: I0125 06:03:33.744116 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerStarted","Data":"261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3"} Jan 25 06:03:33 crc kubenswrapper[4728]: I0125 06:03:33.763224 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jn8pn" podStartSLOduration=3.20915472 podStartE2EDuration="5.763201223s" podCreationTimestamp="2026-01-25 06:03:28 +0000 UTC" firstStartedPulling="2026-01-25 06:03:30.702559237 +0000 UTC m=+1501.738437217" lastFinishedPulling="2026-01-25 06:03:33.25660574 +0000 UTC m=+1504.292483720" observedRunningTime="2026-01-25 06:03:33.759449239 +0000 UTC m=+1504.795327219" watchObservedRunningTime="2026-01-25 06:03:33.763201223 +0000 UTC m=+1504.799079193" Jan 25 06:03:39 crc kubenswrapper[4728]: I0125 06:03:39.467464 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:39 crc kubenswrapper[4728]: I0125 06:03:39.468078 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:39 crc kubenswrapper[4728]: I0125 06:03:39.507729 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:39 crc kubenswrapper[4728]: I0125 06:03:39.825024 4728 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:39 crc kubenswrapper[4728]: I0125 06:03:39.866262 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn8pn"] Jan 25 06:03:41 crc kubenswrapper[4728]: I0125 06:03:41.808972 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jn8pn" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="registry-server" containerID="cri-o://261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3" gracePeriod=2 Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.154905 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dl7zn"] Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.158977 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.163660 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dl7zn"] Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.184594 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbrh\" (UniqueName: \"kubernetes.io/projected/d197dae7-272c-40b0-b249-61abe37744cc-kube-api-access-9lbrh\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.184655 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-utilities\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " 
pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.185164 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-catalog-content\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.264402 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.287365 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j5lc\" (UniqueName: \"kubernetes.io/projected/18ecbe67-7bfd-4e36-8166-14659f560d65-kube-api-access-8j5lc\") pod \"18ecbe67-7bfd-4e36-8166-14659f560d65\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.287419 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-catalog-content\") pod \"18ecbe67-7bfd-4e36-8166-14659f560d65\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.287471 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-utilities\") pod \"18ecbe67-7bfd-4e36-8166-14659f560d65\" (UID: \"18ecbe67-7bfd-4e36-8166-14659f560d65\") " Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.287615 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-catalog-content\") pod 
\"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.287673 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbrh\" (UniqueName: \"kubernetes.io/projected/d197dae7-272c-40b0-b249-61abe37744cc-kube-api-access-9lbrh\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.287697 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-utilities\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.288180 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-utilities\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.289074 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-utilities" (OuterVolumeSpecName: "utilities") pod "18ecbe67-7bfd-4e36-8166-14659f560d65" (UID: "18ecbe67-7bfd-4e36-8166-14659f560d65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.289276 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-catalog-content\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.293391 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ecbe67-7bfd-4e36-8166-14659f560d65-kube-api-access-8j5lc" (OuterVolumeSpecName: "kube-api-access-8j5lc") pod "18ecbe67-7bfd-4e36-8166-14659f560d65" (UID: "18ecbe67-7bfd-4e36-8166-14659f560d65"). InnerVolumeSpecName "kube-api-access-8j5lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.302041 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbrh\" (UniqueName: \"kubernetes.io/projected/d197dae7-272c-40b0-b249-61abe37744cc-kube-api-access-9lbrh\") pod \"certified-operators-dl7zn\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.308039 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18ecbe67-7bfd-4e36-8166-14659f560d65" (UID: "18ecbe67-7bfd-4e36-8166-14659f560d65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.388611 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.388639 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j5lc\" (UniqueName: \"kubernetes.io/projected/18ecbe67-7bfd-4e36-8166-14659f560d65-kube-api-access-8j5lc\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.388650 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ecbe67-7bfd-4e36-8166-14659f560d65-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.484619 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.820165 4728 generic.go:334] "Generic (PLEG): container finished" podID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerID="261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3" exitCode=0 Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.820218 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerDied","Data":"261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3"} Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.820545 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jn8pn" event={"ID":"18ecbe67-7bfd-4e36-8166-14659f560d65","Type":"ContainerDied","Data":"be91d381818140f0c95cf598aadc6d96d6e4a93d173fdf3a5156e0842fa39833"} Jan 25 06:03:42 crc kubenswrapper[4728]: 
I0125 06:03:42.820573 4728 scope.go:117] "RemoveContainer" containerID="261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.820287 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jn8pn" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.845268 4728 scope.go:117] "RemoveContainer" containerID="911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.858368 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn8pn"] Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.872180 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jn8pn"] Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.884437 4728 scope.go:117] "RemoveContainer" containerID="5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.911446 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dl7zn"] Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.926014 4728 scope.go:117] "RemoveContainer" containerID="261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3" Jan 25 06:03:42 crc kubenswrapper[4728]: E0125 06:03:42.935512 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3\": container with ID starting with 261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3 not found: ID does not exist" containerID="261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.935561 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3"} err="failed to get container status \"261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3\": rpc error: code = NotFound desc = could not find container \"261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3\": container with ID starting with 261bd91cef86c9547c9212ebf18d9ce8819dedcc63d7c0f70754a408cce18ff3 not found: ID does not exist" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.935593 4728 scope.go:117] "RemoveContainer" containerID="911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d" Jan 25 06:03:42 crc kubenswrapper[4728]: E0125 06:03:42.935980 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d\": container with ID starting with 911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d not found: ID does not exist" containerID="911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.936004 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d"} err="failed to get container status \"911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d\": rpc error: code = NotFound desc = could not find container \"911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d\": container with ID starting with 911b860002188caa33eca923159dda9826a0304b92797dc0113743e181d34f1d not found: ID does not exist" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.936089 4728 scope.go:117] "RemoveContainer" containerID="5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f" Jan 25 06:03:42 crc kubenswrapper[4728]: E0125 06:03:42.937137 4728 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f\": container with ID starting with 5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f not found: ID does not exist" containerID="5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f" Jan 25 06:03:42 crc kubenswrapper[4728]: I0125 06:03:42.937204 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f"} err="failed to get container status \"5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f\": rpc error: code = NotFound desc = could not find container \"5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f\": container with ID starting with 5c50343b911e5296d758a17acc71ea290d558b6110dba04c00906e49eb96493f not found: ID does not exist" Jan 25 06:03:43 crc kubenswrapper[4728]: I0125 06:03:43.340060 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" path="/var/lib/kubelet/pods/18ecbe67-7bfd-4e36-8166-14659f560d65/volumes" Jan 25 06:03:43 crc kubenswrapper[4728]: I0125 06:03:43.832830 4728 generic.go:334] "Generic (PLEG): container finished" podID="d197dae7-272c-40b0-b249-61abe37744cc" containerID="c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6" exitCode=0 Jan 25 06:03:43 crc kubenswrapper[4728]: I0125 06:03:43.832891 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dl7zn" event={"ID":"d197dae7-272c-40b0-b249-61abe37744cc","Type":"ContainerDied","Data":"c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6"} Jan 25 06:03:43 crc kubenswrapper[4728]: I0125 06:03:43.833149 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dl7zn" 
event={"ID":"d197dae7-272c-40b0-b249-61abe37744cc","Type":"ContainerStarted","Data":"3fc0e76809b2cdc6ee9229bfaf701ad12ba097a7e85fe03ec0bf7d2adc712a02"} Jan 25 06:03:44 crc kubenswrapper[4728]: I0125 06:03:44.847446 4728 generic.go:334] "Generic (PLEG): container finished" podID="d197dae7-272c-40b0-b249-61abe37744cc" containerID="2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b" exitCode=0 Jan 25 06:03:44 crc kubenswrapper[4728]: I0125 06:03:44.847520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dl7zn" event={"ID":"d197dae7-272c-40b0-b249-61abe37744cc","Type":"ContainerDied","Data":"2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b"} Jan 25 06:03:45 crc kubenswrapper[4728]: I0125 06:03:45.859357 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dl7zn" event={"ID":"d197dae7-272c-40b0-b249-61abe37744cc","Type":"ContainerStarted","Data":"a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310"} Jan 25 06:03:45 crc kubenswrapper[4728]: I0125 06:03:45.885111 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dl7zn" podStartSLOduration=2.350310483 podStartE2EDuration="3.885098078s" podCreationTimestamp="2026-01-25 06:03:42 +0000 UTC" firstStartedPulling="2026-01-25 06:03:43.835229171 +0000 UTC m=+1514.871107150" lastFinishedPulling="2026-01-25 06:03:45.370016764 +0000 UTC m=+1516.405894745" observedRunningTime="2026-01-25 06:03:45.877955331 +0000 UTC m=+1516.913833311" watchObservedRunningTime="2026-01-25 06:03:45.885098078 +0000 UTC m=+1516.920976059" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.519123 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jprs2"] Jan 25 06:03:49 crc kubenswrapper[4728]: E0125 06:03:49.520557 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="extract-utilities" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.520575 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="extract-utilities" Jan 25 06:03:49 crc kubenswrapper[4728]: E0125 06:03:49.520584 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="registry-server" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.520592 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="registry-server" Jan 25 06:03:49 crc kubenswrapper[4728]: E0125 06:03:49.520606 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="extract-content" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.520614 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="extract-content" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.520917 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ecbe67-7bfd-4e36-8166-14659f560d65" containerName="registry-server" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.522643 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.528770 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jprs2"] Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.539807 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-catalog-content\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.539901 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgz4f\" (UniqueName: \"kubernetes.io/projected/b8b0f89a-1c47-42e6-adde-581e1013dba6-kube-api-access-bgz4f\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.539961 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-utilities\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.641975 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-catalog-content\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.642045 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bgz4f\" (UniqueName: \"kubernetes.io/projected/b8b0f89a-1c47-42e6-adde-581e1013dba6-kube-api-access-bgz4f\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.642087 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-utilities\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.642565 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-catalog-content\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.642589 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-utilities\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.659017 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgz4f\" (UniqueName: \"kubernetes.io/projected/b8b0f89a-1c47-42e6-adde-581e1013dba6-kube-api-access-bgz4f\") pod \"community-operators-jprs2\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:49 crc kubenswrapper[4728]: I0125 06:03:49.846282 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:50 crc kubenswrapper[4728]: I0125 06:03:50.313784 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jprs2"] Jan 25 06:03:50 crc kubenswrapper[4728]: W0125 06:03:50.315199 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b0f89a_1c47_42e6_adde_581e1013dba6.slice/crio-1656aedebfd4b34757d4027183cdd3a78455821bee839aefe9b09fb56da9d094 WatchSource:0}: Error finding container 1656aedebfd4b34757d4027183cdd3a78455821bee839aefe9b09fb56da9d094: Status 404 returned error can't find the container with id 1656aedebfd4b34757d4027183cdd3a78455821bee839aefe9b09fb56da9d094 Jan 25 06:03:50 crc kubenswrapper[4728]: I0125 06:03:50.913577 4728 generic.go:334] "Generic (PLEG): container finished" podID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerID="34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b" exitCode=0 Jan 25 06:03:50 crc kubenswrapper[4728]: I0125 06:03:50.913702 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprs2" event={"ID":"b8b0f89a-1c47-42e6-adde-581e1013dba6","Type":"ContainerDied","Data":"34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b"} Jan 25 06:03:50 crc kubenswrapper[4728]: I0125 06:03:50.913966 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprs2" event={"ID":"b8b0f89a-1c47-42e6-adde-581e1013dba6","Type":"ContainerStarted","Data":"1656aedebfd4b34757d4027183cdd3a78455821bee839aefe9b09fb56da9d094"} Jan 25 06:03:51 crc kubenswrapper[4728]: I0125 06:03:51.925824 4728 generic.go:334] "Generic (PLEG): container finished" podID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerID="37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d" exitCode=0 Jan 25 06:03:51 crc kubenswrapper[4728]: I0125 
06:03:51.925935 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprs2" event={"ID":"b8b0f89a-1c47-42e6-adde-581e1013dba6","Type":"ContainerDied","Data":"37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d"} Jan 25 06:03:52 crc kubenswrapper[4728]: I0125 06:03:52.485658 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:52 crc kubenswrapper[4728]: I0125 06:03:52.486649 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:52 crc kubenswrapper[4728]: I0125 06:03:52.535313 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:52 crc kubenswrapper[4728]: I0125 06:03:52.937602 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprs2" event={"ID":"b8b0f89a-1c47-42e6-adde-581e1013dba6","Type":"ContainerStarted","Data":"ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e"} Jan 25 06:03:52 crc kubenswrapper[4728]: I0125 06:03:52.963266 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jprs2" podStartSLOduration=2.462427128 podStartE2EDuration="3.963242009s" podCreationTimestamp="2026-01-25 06:03:49 +0000 UTC" firstStartedPulling="2026-01-25 06:03:50.916343101 +0000 UTC m=+1521.952221080" lastFinishedPulling="2026-01-25 06:03:52.417157982 +0000 UTC m=+1523.453035961" observedRunningTime="2026-01-25 06:03:52.953379624 +0000 UTC m=+1523.989257604" watchObservedRunningTime="2026-01-25 06:03:52.963242009 +0000 UTC m=+1523.999119989" Jan 25 06:03:52 crc kubenswrapper[4728]: I0125 06:03:52.977550 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dl7zn" Jan 
25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.034406 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kq85j"] Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.046144 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dllgm"] Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.054458 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-704e-account-create-update-9vw4c"] Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.061444 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dllgm"] Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.066836 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kq85j"] Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.072153 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-704e-account-create-update-9vw4c"] Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.338576 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0995cb23-3429-4757-95d1-7f48216b7dce" path="/var/lib/kubelet/pods/0995cb23-3429-4757-95d1-7f48216b7dce/volumes" Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.339502 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be31ea3f-183c-4282-bf47-2fc7e17ab5b0" path="/var/lib/kubelet/pods/be31ea3f-183c-4282-bf47-2fc7e17ab5b0/volumes" Jan 25 06:03:53 crc kubenswrapper[4728]: I0125 06:03:53.340019 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d022c478-e9a3-40d9-b37e-54d2ba1e5ba1" path="/var/lib/kubelet/pods/d022c478-e9a3-40d9-b37e-54d2ba1e5ba1/volumes" Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.023453 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0621-account-create-update-6q629"] Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.036896 4728 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-b6e8-account-create-update-pg7cx"] Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.044530 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qh5qp"] Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.051614 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0621-account-create-update-6q629"] Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.058136 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b6e8-account-create-update-pg7cx"] Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.063666 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qh5qp"] Jan 25 06:03:54 crc kubenswrapper[4728]: I0125 06:03:54.899943 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dl7zn"] Jan 25 06:03:55 crc kubenswrapper[4728]: I0125 06:03:55.343267 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a53ae7b-d679-4ae7-a6c6-d3465781c613" path="/var/lib/kubelet/pods/6a53ae7b-d679-4ae7-a6c6-d3465781c613/volumes" Jan 25 06:03:55 crc kubenswrapper[4728]: I0125 06:03:55.343926 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7319f4f1-86d5-4681-ba6c-012c0f3039ac" path="/var/lib/kubelet/pods/7319f4f1-86d5-4681-ba6c-012c0f3039ac/volumes" Jan 25 06:03:55 crc kubenswrapper[4728]: I0125 06:03:55.344503 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803f99c7-af4a-4c8a-99ac-42a58563c3d2" path="/var/lib/kubelet/pods/803f99c7-af4a-4c8a-99ac-42a58563c3d2/volumes" Jan 25 06:03:55 crc kubenswrapper[4728]: I0125 06:03:55.979174 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dl7zn" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="registry-server" 
containerID="cri-o://a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310" gracePeriod=2 Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.394688 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.581248 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-utilities\") pod \"d197dae7-272c-40b0-b249-61abe37744cc\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.581287 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-catalog-content\") pod \"d197dae7-272c-40b0-b249-61abe37744cc\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.581376 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lbrh\" (UniqueName: \"kubernetes.io/projected/d197dae7-272c-40b0-b249-61abe37744cc-kube-api-access-9lbrh\") pod \"d197dae7-272c-40b0-b249-61abe37744cc\" (UID: \"d197dae7-272c-40b0-b249-61abe37744cc\") " Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.582151 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-utilities" (OuterVolumeSpecName: "utilities") pod "d197dae7-272c-40b0-b249-61abe37744cc" (UID: "d197dae7-272c-40b0-b249-61abe37744cc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.583297 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.587402 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d197dae7-272c-40b0-b249-61abe37744cc-kube-api-access-9lbrh" (OuterVolumeSpecName: "kube-api-access-9lbrh") pod "d197dae7-272c-40b0-b249-61abe37744cc" (UID: "d197dae7-272c-40b0-b249-61abe37744cc"). InnerVolumeSpecName "kube-api-access-9lbrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.619587 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d197dae7-272c-40b0-b249-61abe37744cc" (UID: "d197dae7-272c-40b0-b249-61abe37744cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.685435 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d197dae7-272c-40b0-b249-61abe37744cc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.685472 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lbrh\" (UniqueName: \"kubernetes.io/projected/d197dae7-272c-40b0-b249-61abe37744cc-kube-api-access-9lbrh\") on node \"crc\" DevicePath \"\"" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.991853 4728 generic.go:334] "Generic (PLEG): container finished" podID="d197dae7-272c-40b0-b249-61abe37744cc" containerID="a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310" exitCode=0 Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.991946 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dl7zn" Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.991962 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dl7zn" event={"ID":"d197dae7-272c-40b0-b249-61abe37744cc","Type":"ContainerDied","Data":"a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310"} Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.992244 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dl7zn" event={"ID":"d197dae7-272c-40b0-b249-61abe37744cc","Type":"ContainerDied","Data":"3fc0e76809b2cdc6ee9229bfaf701ad12ba097a7e85fe03ec0bf7d2adc712a02"} Jan 25 06:03:56 crc kubenswrapper[4728]: I0125 06:03:56.992271 4728 scope.go:117] "RemoveContainer" containerID="a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.013849 4728 scope.go:117] "RemoveContainer" 
containerID="2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.022597 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dl7zn"] Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.037597 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dl7zn"] Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.045570 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w8ld9"] Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.053302 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w8ld9"] Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.054824 4728 scope.go:117] "RemoveContainer" containerID="c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.075257 4728 scope.go:117] "RemoveContainer" containerID="a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310" Jan 25 06:03:57 crc kubenswrapper[4728]: E0125 06:03:57.075704 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310\": container with ID starting with a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310 not found: ID does not exist" containerID="a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.075754 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310"} err="failed to get container status \"a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310\": rpc error: code = NotFound desc = could not find container 
\"a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310\": container with ID starting with a1ee088a3621115d7672d197649785a0e07a9d6edb10b2f7b2225d90ec1e1310 not found: ID does not exist" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.075790 4728 scope.go:117] "RemoveContainer" containerID="2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b" Jan 25 06:03:57 crc kubenswrapper[4728]: E0125 06:03:57.076394 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b\": container with ID starting with 2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b not found: ID does not exist" containerID="2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.076456 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b"} err="failed to get container status \"2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b\": rpc error: code = NotFound desc = could not find container \"2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b\": container with ID starting with 2bec32c50e09d1c13619b656111195d057685ccb00606b49f6e0fa150e2d744b not found: ID does not exist" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.076515 4728 scope.go:117] "RemoveContainer" containerID="c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6" Jan 25 06:03:57 crc kubenswrapper[4728]: E0125 06:03:57.076889 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6\": container with ID starting with c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6 not found: ID does not exist" 
containerID="c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.076926 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6"} err="failed to get container status \"c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6\": rpc error: code = NotFound desc = could not find container \"c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6\": container with ID starting with c67bd95dc64891093fbcec5a61cee32696e3869ee8da6ab9bcb99133a06664c6 not found: ID does not exist" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.338748 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d197dae7-272c-40b0-b249-61abe37744cc" path="/var/lib/kubelet/pods/d197dae7-272c-40b0-b249-61abe37744cc/volumes" Jan 25 06:03:57 crc kubenswrapper[4728]: I0125 06:03:57.339534 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db13ac9b-f01e-42d8-b455-db929ef4b64c" path="/var/lib/kubelet/pods/db13ac9b-f01e-42d8-b455-db929ef4b64c/volumes" Jan 25 06:03:59 crc kubenswrapper[4728]: I0125 06:03:59.846858 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:59 crc kubenswrapper[4728]: I0125 06:03:59.847605 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:03:59 crc kubenswrapper[4728]: I0125 06:03:59.886258 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:04:00 crc kubenswrapper[4728]: I0125 06:04:00.072023 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:04:01 crc kubenswrapper[4728]: I0125 06:04:01.102421 
4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jprs2"] Jan 25 06:04:02 crc kubenswrapper[4728]: I0125 06:04:02.034601 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-r67j5"] Jan 25 06:04:02 crc kubenswrapper[4728]: I0125 06:04:02.042936 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-r67j5"] Jan 25 06:04:02 crc kubenswrapper[4728]: I0125 06:04:02.047042 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jprs2" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="registry-server" containerID="cri-o://ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e" gracePeriod=2 Jan 25 06:04:02 crc kubenswrapper[4728]: I0125 06:04:02.958269 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.006457 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-catalog-content\") pod \"b8b0f89a-1c47-42e6-adde-581e1013dba6\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.006522 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgz4f\" (UniqueName: \"kubernetes.io/projected/b8b0f89a-1c47-42e6-adde-581e1013dba6-kube-api-access-bgz4f\") pod \"b8b0f89a-1c47-42e6-adde-581e1013dba6\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.006678 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-utilities\") pod 
\"b8b0f89a-1c47-42e6-adde-581e1013dba6\" (UID: \"b8b0f89a-1c47-42e6-adde-581e1013dba6\") " Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.007399 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-utilities" (OuterVolumeSpecName: "utilities") pod "b8b0f89a-1c47-42e6-adde-581e1013dba6" (UID: "b8b0f89a-1c47-42e6-adde-581e1013dba6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.008149 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.012976 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b0f89a-1c47-42e6-adde-581e1013dba6-kube-api-access-bgz4f" (OuterVolumeSpecName: "kube-api-access-bgz4f") pod "b8b0f89a-1c47-42e6-adde-581e1013dba6" (UID: "b8b0f89a-1c47-42e6-adde-581e1013dba6"). InnerVolumeSpecName "kube-api-access-bgz4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.044747 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8b0f89a-1c47-42e6-adde-581e1013dba6" (UID: "b8b0f89a-1c47-42e6-adde-581e1013dba6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.056596 4728 generic.go:334] "Generic (PLEG): container finished" podID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerID="ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e" exitCode=0 Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.056641 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprs2" event={"ID":"b8b0f89a-1c47-42e6-adde-581e1013dba6","Type":"ContainerDied","Data":"ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e"} Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.056675 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprs2" event={"ID":"b8b0f89a-1c47-42e6-adde-581e1013dba6","Type":"ContainerDied","Data":"1656aedebfd4b34757d4027183cdd3a78455821bee839aefe9b09fb56da9d094"} Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.056692 4728 scope.go:117] "RemoveContainer" containerID="ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.056840 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jprs2" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.086290 4728 scope.go:117] "RemoveContainer" containerID="37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.097487 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jprs2"] Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.108695 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jprs2"] Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.109205 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b0f89a-1c47-42e6-adde-581e1013dba6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.109239 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgz4f\" (UniqueName: \"kubernetes.io/projected/b8b0f89a-1c47-42e6-adde-581e1013dba6-kube-api-access-bgz4f\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.122132 4728 scope.go:117] "RemoveContainer" containerID="34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.144868 4728 scope.go:117] "RemoveContainer" containerID="ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e" Jan 25 06:04:03 crc kubenswrapper[4728]: E0125 06:04:03.145468 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e\": container with ID starting with ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e not found: ID does not exist" containerID="ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e" Jan 25 06:04:03 crc 
kubenswrapper[4728]: I0125 06:04:03.145520 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e"} err="failed to get container status \"ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e\": rpc error: code = NotFound desc = could not find container \"ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e\": container with ID starting with ba00a7a769f59aa9a33698e0533fbe082c7876f2258104ca6158a94b7daa415e not found: ID does not exist" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.145567 4728 scope.go:117] "RemoveContainer" containerID="37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d" Jan 25 06:04:03 crc kubenswrapper[4728]: E0125 06:04:03.145907 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d\": container with ID starting with 37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d not found: ID does not exist" containerID="37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.145932 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d"} err="failed to get container status \"37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d\": rpc error: code = NotFound desc = could not find container \"37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d\": container with ID starting with 37e2a326633568d34e63a697a7e4c2d2fec2aaadf39160661f4dbcad2b47969d not found: ID does not exist" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.145949 4728 scope.go:117] "RemoveContainer" containerID="34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b" Jan 25 
06:04:03 crc kubenswrapper[4728]: E0125 06:04:03.146294 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b\": container with ID starting with 34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b not found: ID does not exist" containerID="34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.146368 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b"} err="failed to get container status \"34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b\": rpc error: code = NotFound desc = could not find container \"34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b\": container with ID starting with 34ef04fefb6293d5a7e58b5cbdd608478ca814c94c834d2a4467e229062a142b not found: ID does not exist" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.338690 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b9902f-5c07-4a73-8a08-0a1c28e09fd8" path="/var/lib/kubelet/pods/00b9902f-5c07-4a73-8a08-0a1c28e09fd8/volumes" Jan 25 06:04:03 crc kubenswrapper[4728]: I0125 06:04:03.339226 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" path="/var/lib/kubelet/pods/b8b0f89a-1c47-42e6-adde-581e1013dba6/volumes" Jan 25 06:04:05 crc kubenswrapper[4728]: I0125 06:04:05.080578 4728 generic.go:334] "Generic (PLEG): container finished" podID="022da454-0c7e-4950-9147-f13a2f725b47" containerID="b6431ce75eb91940c0d21ca5848da8e9c7e3bded59efb29dcdd0bd047a949aac" exitCode=0 Jan 25 06:04:05 crc kubenswrapper[4728]: I0125 06:04:05.080665 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" event={"ID":"022da454-0c7e-4950-9147-f13a2f725b47","Type":"ContainerDied","Data":"b6431ce75eb91940c0d21ca5848da8e9c7e3bded59efb29dcdd0bd047a949aac"} Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.474640 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.677070 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-inventory\") pod \"022da454-0c7e-4950-9147-f13a2f725b47\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.677155 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl4jh\" (UniqueName: \"kubernetes.io/projected/022da454-0c7e-4950-9147-f13a2f725b47-kube-api-access-tl4jh\") pod \"022da454-0c7e-4950-9147-f13a2f725b47\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.677248 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-ssh-key-openstack-edpm-ipam\") pod \"022da454-0c7e-4950-9147-f13a2f725b47\" (UID: \"022da454-0c7e-4950-9147-f13a2f725b47\") " Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.684297 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/022da454-0c7e-4950-9147-f13a2f725b47-kube-api-access-tl4jh" (OuterVolumeSpecName: "kube-api-access-tl4jh") pod "022da454-0c7e-4950-9147-f13a2f725b47" (UID: "022da454-0c7e-4950-9147-f13a2f725b47"). InnerVolumeSpecName "kube-api-access-tl4jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.703947 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "022da454-0c7e-4950-9147-f13a2f725b47" (UID: "022da454-0c7e-4950-9147-f13a2f725b47"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.705840 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-inventory" (OuterVolumeSpecName: "inventory") pod "022da454-0c7e-4950-9147-f13a2f725b47" (UID: "022da454-0c7e-4950-9147-f13a2f725b47"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.780548 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.780606 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/022da454-0c7e-4950-9147-f13a2f725b47-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:06 crc kubenswrapper[4728]: I0125 06:04:06.780619 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl4jh\" (UniqueName: \"kubernetes.io/projected/022da454-0c7e-4950-9147-f13a2f725b47-kube-api-access-tl4jh\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.103642 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" 
event={"ID":"022da454-0c7e-4950-9147-f13a2f725b47","Type":"ContainerDied","Data":"72afca3416a1cb4f4ed4459ab92f5f680f2abcc26b22bcdbbed818df041b2561"} Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.103688 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72afca3416a1cb4f4ed4459ab92f5f680f2abcc26b22bcdbbed818df041b2561" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.103746 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164101 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf"] Jan 25 06:04:07 crc kubenswrapper[4728]: E0125 06:04:07.164556 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="registry-server" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164575 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="registry-server" Jan 25 06:04:07 crc kubenswrapper[4728]: E0125 06:04:07.164591 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="registry-server" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164598 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="registry-server" Jan 25 06:04:07 crc kubenswrapper[4728]: E0125 06:04:07.164607 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="extract-utilities" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164615 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="extract-utilities" Jan 25 06:04:07 crc 
kubenswrapper[4728]: E0125 06:04:07.164626 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="extract-utilities" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164633 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="extract-utilities" Jan 25 06:04:07 crc kubenswrapper[4728]: E0125 06:04:07.164649 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="022da454-0c7e-4950-9147-f13a2f725b47" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164659 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="022da454-0c7e-4950-9147-f13a2f725b47" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:07 crc kubenswrapper[4728]: E0125 06:04:07.164676 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="extract-content" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164682 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="extract-content" Jan 25 06:04:07 crc kubenswrapper[4728]: E0125 06:04:07.164692 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="extract-content" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164699 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="extract-content" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164890 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b0f89a-1c47-42e6-adde-581e1013dba6" containerName="registry-server" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164905 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="022da454-0c7e-4950-9147-f13a2f725b47" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.164926 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d197dae7-272c-40b0-b249-61abe37744cc" containerName="registry-server" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.165597 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.169920 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.170195 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.170333 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.170937 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.173697 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf"] Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.187557 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.187716 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.187751 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps66c\" (UniqueName: \"kubernetes.io/projected/11ed206c-89ec-40be-ad2e-6217031ce033-kube-api-access-ps66c\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.289603 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.289668 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps66c\" (UniqueName: \"kubernetes.io/projected/11ed206c-89ec-40be-ad2e-6217031ce033-kube-api-access-ps66c\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.289740 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.293532 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.293532 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.304655 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps66c\" (UniqueName: \"kubernetes.io/projected/11ed206c-89ec-40be-ad2e-6217031ce033-kube-api-access-ps66c\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.479391 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:07 crc kubenswrapper[4728]: I0125 06:04:07.960417 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf"] Jan 25 06:04:08 crc kubenswrapper[4728]: I0125 06:04:08.112342 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" event={"ID":"11ed206c-89ec-40be-ad2e-6217031ce033","Type":"ContainerStarted","Data":"60ce7b8b6b2f04117ac5eae8c6dbe6832e6d2f3f3a58ec452422894f2dd4f191"} Jan 25 06:04:09 crc kubenswrapper[4728]: I0125 06:04:09.124878 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" event={"ID":"11ed206c-89ec-40be-ad2e-6217031ce033","Type":"ContainerStarted","Data":"418280eab818d7fbe7670ab4ed6c4dad4875bea8ef215d56fc1893ba1ccced25"} Jan 25 06:04:12 crc kubenswrapper[4728]: I0125 06:04:12.150382 4728 generic.go:334] "Generic (PLEG): container finished" podID="11ed206c-89ec-40be-ad2e-6217031ce033" containerID="418280eab818d7fbe7670ab4ed6c4dad4875bea8ef215d56fc1893ba1ccced25" exitCode=0 Jan 25 06:04:12 crc kubenswrapper[4728]: I0125 06:04:12.150456 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" event={"ID":"11ed206c-89ec-40be-ad2e-6217031ce033","Type":"ContainerDied","Data":"418280eab818d7fbe7670ab4ed6c4dad4875bea8ef215d56fc1893ba1ccced25"} Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.506044 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.612222 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-ssh-key-openstack-edpm-ipam\") pod \"11ed206c-89ec-40be-ad2e-6217031ce033\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.612332 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory\") pod \"11ed206c-89ec-40be-ad2e-6217031ce033\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.612365 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps66c\" (UniqueName: \"kubernetes.io/projected/11ed206c-89ec-40be-ad2e-6217031ce033-kube-api-access-ps66c\") pod \"11ed206c-89ec-40be-ad2e-6217031ce033\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.618853 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ed206c-89ec-40be-ad2e-6217031ce033-kube-api-access-ps66c" (OuterVolumeSpecName: "kube-api-access-ps66c") pod "11ed206c-89ec-40be-ad2e-6217031ce033" (UID: "11ed206c-89ec-40be-ad2e-6217031ce033"). InnerVolumeSpecName "kube-api-access-ps66c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:04:13 crc kubenswrapper[4728]: E0125 06:04:13.635267 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory podName:11ed206c-89ec-40be-ad2e-6217031ce033 nodeName:}" failed. 
No retries permitted until 2026-01-25 06:04:14.135232793 +0000 UTC m=+1545.171110773 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory") pod "11ed206c-89ec-40be-ad2e-6217031ce033" (UID: "11ed206c-89ec-40be-ad2e-6217031ce033") : error deleting /var/lib/kubelet/pods/11ed206c-89ec-40be-ad2e-6217031ce033/volume-subpaths: remove /var/lib/kubelet/pods/11ed206c-89ec-40be-ad2e-6217031ce033/volume-subpaths: no such file or directory Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.637145 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "11ed206c-89ec-40be-ad2e-6217031ce033" (UID: "11ed206c-89ec-40be-ad2e-6217031ce033"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.717189 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:13 crc kubenswrapper[4728]: I0125 06:04:13.717376 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps66c\" (UniqueName: \"kubernetes.io/projected/11ed206c-89ec-40be-ad2e-6217031ce033-kube-api-access-ps66c\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.169059 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" event={"ID":"11ed206c-89ec-40be-ad2e-6217031ce033","Type":"ContainerDied","Data":"60ce7b8b6b2f04117ac5eae8c6dbe6832e6d2f3f3a58ec452422894f2dd4f191"} Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 
06:04:14.169285 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ce7b8b6b2f04117ac5eae8c6dbe6832e6d2f3f3a58ec452422894f2dd4f191" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.169112 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.216297 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk"] Jan 25 06:04:14 crc kubenswrapper[4728]: E0125 06:04:14.216836 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ed206c-89ec-40be-ad2e-6217031ce033" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.216856 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ed206c-89ec-40be-ad2e-6217031ce033" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.217026 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ed206c-89ec-40be-ad2e-6217031ce033" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.217764 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.224766 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk"] Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.225983 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory\") pod \"11ed206c-89ec-40be-ad2e-6217031ce033\" (UID: \"11ed206c-89ec-40be-ad2e-6217031ce033\") " Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.226467 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.226907 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.227171 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khwb8\" (UniqueName: \"kubernetes.io/projected/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-kube-api-access-khwb8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.230222 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory" (OuterVolumeSpecName: "inventory") pod "11ed206c-89ec-40be-ad2e-6217031ce033" (UID: "11ed206c-89ec-40be-ad2e-6217031ce033"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.329684 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.330453 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khwb8\" (UniqueName: \"kubernetes.io/projected/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-kube-api-access-khwb8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.330607 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.330738 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/11ed206c-89ec-40be-ad2e-6217031ce033-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.333284 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.334442 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.345830 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khwb8\" (UniqueName: \"kubernetes.io/projected/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-kube-api-access-khwb8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qx5hk\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:14 crc kubenswrapper[4728]: I0125 06:04:14.532882 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:15 crc kubenswrapper[4728]: I0125 06:04:15.006669 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk"] Jan 25 06:04:15 crc kubenswrapper[4728]: I0125 06:04:15.177736 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" event={"ID":"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d","Type":"ContainerStarted","Data":"92ab597371939a97701918612882ae4233a8fec320078a050bd1a698f814f1b6"} Jan 25 06:04:16 crc kubenswrapper[4728]: I0125 06:04:16.028466 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pr68r"] Jan 25 06:04:16 crc kubenswrapper[4728]: I0125 06:04:16.037213 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pr68r"] Jan 25 06:04:16 crc kubenswrapper[4728]: I0125 06:04:16.049398 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nv6tr"] Jan 25 06:04:16 crc kubenswrapper[4728]: I0125 06:04:16.056593 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nv6tr"] Jan 25 06:04:16 crc kubenswrapper[4728]: I0125 06:04:16.186432 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" event={"ID":"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d","Type":"ContainerStarted","Data":"6a869fb7ea18189c8eeaa5e46549537eaad9d2637ccd4d665fa08057336a6136"} Jan 25 06:04:16 crc kubenswrapper[4728]: I0125 06:04:16.206289 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" podStartSLOduration=1.735287444 podStartE2EDuration="2.206271904s" podCreationTimestamp="2026-01-25 06:04:14 +0000 UTC" firstStartedPulling="2026-01-25 06:04:15.007547673 +0000 UTC m=+1546.043425653" 
lastFinishedPulling="2026-01-25 06:04:15.478532133 +0000 UTC m=+1546.514410113" observedRunningTime="2026-01-25 06:04:16.19831438 +0000 UTC m=+1547.234192360" watchObservedRunningTime="2026-01-25 06:04:16.206271904 +0000 UTC m=+1547.242149885" Jan 25 06:04:17 crc kubenswrapper[4728]: I0125 06:04:17.337394 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019c7f41-0990-4235-b29d-8d8e08d34af1" path="/var/lib/kubelet/pods/019c7f41-0990-4235-b29d-8d8e08d34af1/volumes" Jan 25 06:04:17 crc kubenswrapper[4728]: I0125 06:04:17.338103 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d85a5b2-cb44-4190-973a-179ad187fd37" path="/var/lib/kubelet/pods/9d85a5b2-cb44-4190-973a-179ad187fd37/volumes" Jan 25 06:04:20 crc kubenswrapper[4728]: I0125 06:04:20.027293 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sbl2s"] Jan 25 06:04:20 crc kubenswrapper[4728]: I0125 06:04:20.035151 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sbl2s"] Jan 25 06:04:21 crc kubenswrapper[4728]: I0125 06:04:21.337376 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639" path="/var/lib/kubelet/pods/bdd0ddf2-b2d4-4568-abc5-7fd76cbcb639/volumes" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.038667 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-72kn4"] Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.044993 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-72kn4"] Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.186919 4728 scope.go:117] "RemoveContainer" containerID="153c3e929ee527e8d0eefcd225e5a333815034615f90b9d17a201bdf5dc4b832" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.222842 4728 scope.go:117] "RemoveContainer" containerID="e21ea8a945a43884e58d6ac8b724dff7592965bb8c6506aaecc4d8213e8f890f" Jan 25 
06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.251289 4728 scope.go:117] "RemoveContainer" containerID="2587cc2bf4bab8cbfe3b128c260ab728490492a762accd349e62fe11a5e81578" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.288611 4728 scope.go:117] "RemoveContainer" containerID="2e7ba0b9a2b6ea3e9ce3a9d0a2d5cdb7c7fc910b1849cc30580dc23438151e33" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.307874 4728 scope.go:117] "RemoveContainer" containerID="948d5d216e63f9a49474fb936dfb1138a6f5a8b909b42e01b8bab5c42584911a" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.352909 4728 scope.go:117] "RemoveContainer" containerID="35863c313e57052c1427c3c7d10d10a7664fc2dbaa5d801ca8fc1e6958f77af2" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.371920 4728 scope.go:117] "RemoveContainer" containerID="9b16d03f47f8d52a84050392808ba88d387bafab010b302c21dc736bc248968e" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.392918 4728 scope.go:117] "RemoveContainer" containerID="34d7a4a61b459b8b560ca6a4a343deed5bd1033a3bb04f2f77b293561bdcdfe3" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.414717 4728 scope.go:117] "RemoveContainer" containerID="7468c23e4107a46410947d49470ab02fe6f1c86036e819b2bee3284d6485c791" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.456865 4728 scope.go:117] "RemoveContainer" containerID="9254c641e6296bc21d55e1dfdc27a045033efaec56fddf85a1c1a79c8777b95a" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.482060 4728 scope.go:117] "RemoveContainer" containerID="ab496c3faf6e958c6f7ddb49fd6d5b1978f109534141e8344a9401aa4eda745f" Jan 25 06:04:32 crc kubenswrapper[4728]: I0125 06:04:32.507967 4728 scope.go:117] "RemoveContainer" containerID="7290a07d6a05d00f86f00dcd4ab47bd33b677d8ce2c9aca1e22ac4f532e78615" Jan 25 06:04:33 crc kubenswrapper[4728]: I0125 06:04:33.336210 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="554cda4a-e73e-4f9c-93aa-23c41ef468a5" 
path="/var/lib/kubelet/pods/554cda4a-e73e-4f9c-93aa-23c41ef468a5/volumes" Jan 25 06:04:37 crc kubenswrapper[4728]: I0125 06:04:37.029455 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-csg7p"] Jan 25 06:04:37 crc kubenswrapper[4728]: I0125 06:04:37.035418 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-csg7p"] Jan 25 06:04:37 crc kubenswrapper[4728]: I0125 06:04:37.338764 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d5365c-76ac-4544-b1e2-ae442ee191dd" path="/var/lib/kubelet/pods/74d5365c-76ac-4544-b1e2-ae442ee191dd/volumes" Jan 25 06:04:41 crc kubenswrapper[4728]: I0125 06:04:41.406569 4728 generic.go:334] "Generic (PLEG): container finished" podID="4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" containerID="6a869fb7ea18189c8eeaa5e46549537eaad9d2637ccd4d665fa08057336a6136" exitCode=0 Jan 25 06:04:41 crc kubenswrapper[4728]: I0125 06:04:41.406657 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" event={"ID":"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d","Type":"ContainerDied","Data":"6a869fb7ea18189c8eeaa5e46549537eaad9d2637ccd4d665fa08057336a6136"} Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 06:04:42.794830 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 06:04:42.885893 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-ssh-key-openstack-edpm-ipam\") pod \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 06:04:42.910665 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" (UID: "4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 06:04:42.987487 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-inventory\") pod \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 06:04:42.987565 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khwb8\" (UniqueName: \"kubernetes.io/projected/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-kube-api-access-khwb8\") pod \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\" (UID: \"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d\") " Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 06:04:42.988167 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:42 crc kubenswrapper[4728]: I0125 
06:04:42.990646 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-kube-api-access-khwb8" (OuterVolumeSpecName: "kube-api-access-khwb8") pod "4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" (UID: "4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d"). InnerVolumeSpecName "kube-api-access-khwb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.007897 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-inventory" (OuterVolumeSpecName: "inventory") pod "4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" (UID: "4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.089774 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.089809 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khwb8\" (UniqueName: \"kubernetes.io/projected/4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d-kube-api-access-khwb8\") on node \"crc\" DevicePath \"\"" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.423098 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" event={"ID":"4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d","Type":"ContainerDied","Data":"92ab597371939a97701918612882ae4233a8fec320078a050bd1a698f814f1b6"} Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.423140 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ab597371939a97701918612882ae4233a8fec320078a050bd1a698f814f1b6" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.423220 4728 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qx5hk" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.491369 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs"] Jan 25 06:04:43 crc kubenswrapper[4728]: E0125 06:04:43.491898 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.491919 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.492097 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.492790 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.495795 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.495954 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.495981 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.496163 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.497918 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.497996 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwtrr\" (UniqueName: \"kubernetes.io/projected/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-kube-api-access-vwtrr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.498036 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.502017 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs"] Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.600542 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.600674 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwtrr\" (UniqueName: \"kubernetes.io/projected/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-kube-api-access-vwtrr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.600722 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.606274 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.606374 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.614976 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwtrr\" (UniqueName: \"kubernetes.io/projected/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-kube-api-access-vwtrr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:43 crc kubenswrapper[4728]: I0125 06:04:43.811983 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:04:44 crc kubenswrapper[4728]: I0125 06:04:44.270001 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs"] Jan 25 06:04:44 crc kubenswrapper[4728]: I0125 06:04:44.434018 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" event={"ID":"c70ce086-a2f9-4979-b292-aa69dc5f9bc3","Type":"ContainerStarted","Data":"41e0e2809ebad60118250ce4aa6e86700c491f047eadd9d7cd1d2c392cec2667"} Jan 25 06:04:45 crc kubenswrapper[4728]: I0125 06:04:45.442673 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" event={"ID":"c70ce086-a2f9-4979-b292-aa69dc5f9bc3","Type":"ContainerStarted","Data":"5e6df71324da97badaa1a519abafbb396921c7bea861bccd867bb0dd92f17e7c"} Jan 25 06:04:45 crc kubenswrapper[4728]: I0125 06:04:45.467660 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" podStartSLOduration=1.965783152 podStartE2EDuration="2.467634049s" podCreationTimestamp="2026-01-25 06:04:43 +0000 UTC" firstStartedPulling="2026-01-25 06:04:44.274752273 +0000 UTC m=+1575.310630253" lastFinishedPulling="2026-01-25 06:04:44.77660317 +0000 UTC m=+1575.812481150" observedRunningTime="2026-01-25 06:04:45.456833274 +0000 UTC m=+1576.492711255" watchObservedRunningTime="2026-01-25 06:04:45.467634049 +0000 UTC m=+1576.503512028" Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.044859 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e2fd-account-create-update-5n5s4"] Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.056434 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4eec-account-create-update-m2z8d"] Jan 25 06:05:15 crc 
kubenswrapper[4728]: I0125 06:05:15.063209 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-cxrhx"] Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.069839 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-cxrhx"] Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.074879 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4eec-account-create-update-m2z8d"] Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.079609 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e2fd-account-create-update-5n5s4"] Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.339123 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0154728c-361d-44a4-85ca-39167e69fc69" path="/var/lib/kubelet/pods/0154728c-361d-44a4-85ca-39167e69fc69/volumes" Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.340032 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c35f4a-a6c8-4be9-9ff5-558125a8e2bb" path="/var/lib/kubelet/pods/33c35f4a-a6c8-4be9-9ff5-558125a8e2bb/volumes" Jan 25 06:05:15 crc kubenswrapper[4728]: I0125 06:05:15.340728 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65872da3-e443-40db-8a63-96aea1382b3f" path="/var/lib/kubelet/pods/65872da3-e443-40db-8a63-96aea1382b3f/volumes" Jan 25 06:05:16 crc kubenswrapper[4728]: I0125 06:05:16.038568 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bbea-account-create-update-t7pck"] Jan 25 06:05:16 crc kubenswrapper[4728]: I0125 06:05:16.044836 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9gc2d"] Jan 25 06:05:16 crc kubenswrapper[4728]: I0125 06:05:16.051309 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9gc2d"] Jan 25 06:05:16 crc kubenswrapper[4728]: I0125 06:05:16.056287 4728 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hz4pk"] Jan 25 06:05:16 crc kubenswrapper[4728]: I0125 06:05:16.067186 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bbea-account-create-update-t7pck"] Jan 25 06:05:16 crc kubenswrapper[4728]: I0125 06:05:16.075036 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hz4pk"] Jan 25 06:05:17 crc kubenswrapper[4728]: I0125 06:05:17.338023 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17bc2814-7f95-490f-8ac3-2596dad8acc7" path="/var/lib/kubelet/pods/17bc2814-7f95-490f-8ac3-2596dad8acc7/volumes" Jan 25 06:05:17 crc kubenswrapper[4728]: I0125 06:05:17.338862 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5739844d-c36a-4b25-b5e1-6049a9be36a5" path="/var/lib/kubelet/pods/5739844d-c36a-4b25-b5e1-6049a9be36a5/volumes" Jan 25 06:05:17 crc kubenswrapper[4728]: I0125 06:05:17.339402 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a338507-ec0d-4723-9eff-8242791ac1e4" path="/var/lib/kubelet/pods/5a338507-ec0d-4723-9eff-8242791ac1e4/volumes" Jan 25 06:05:19 crc kubenswrapper[4728]: I0125 06:05:19.752013 4728 generic.go:334] "Generic (PLEG): container finished" podID="c70ce086-a2f9-4979-b292-aa69dc5f9bc3" containerID="5e6df71324da97badaa1a519abafbb396921c7bea861bccd867bb0dd92f17e7c" exitCode=0 Jan 25 06:05:19 crc kubenswrapper[4728]: I0125 06:05:19.752103 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" event={"ID":"c70ce086-a2f9-4979-b292-aa69dc5f9bc3","Type":"ContainerDied","Data":"5e6df71324da97badaa1a519abafbb396921c7bea861bccd867bb0dd92f17e7c"} Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.054252 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.194494 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-ssh-key-openstack-edpm-ipam\") pod \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.194865 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-inventory\") pod \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.195018 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwtrr\" (UniqueName: \"kubernetes.io/projected/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-kube-api-access-vwtrr\") pod \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\" (UID: \"c70ce086-a2f9-4979-b292-aa69dc5f9bc3\") " Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.201189 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-kube-api-access-vwtrr" (OuterVolumeSpecName: "kube-api-access-vwtrr") pod "c70ce086-a2f9-4979-b292-aa69dc5f9bc3" (UID: "c70ce086-a2f9-4979-b292-aa69dc5f9bc3"). InnerVolumeSpecName "kube-api-access-vwtrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.220186 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c70ce086-a2f9-4979-b292-aa69dc5f9bc3" (UID: "c70ce086-a2f9-4979-b292-aa69dc5f9bc3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.220557 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-inventory" (OuterVolumeSpecName: "inventory") pod "c70ce086-a2f9-4979-b292-aa69dc5f9bc3" (UID: "c70ce086-a2f9-4979-b292-aa69dc5f9bc3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.298476 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.298510 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwtrr\" (UniqueName: \"kubernetes.io/projected/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-kube-api-access-vwtrr\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.298524 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70ce086-a2f9-4979-b292-aa69dc5f9bc3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.768749 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" 
event={"ID":"c70ce086-a2f9-4979-b292-aa69dc5f9bc3","Type":"ContainerDied","Data":"41e0e2809ebad60118250ce4aa6e86700c491f047eadd9d7cd1d2c392cec2667"} Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.769081 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41e0e2809ebad60118250ce4aa6e86700c491f047eadd9d7cd1d2c392cec2667" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.768826 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.827009 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxvkl"] Jan 25 06:05:21 crc kubenswrapper[4728]: E0125 06:05:21.827521 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70ce086-a2f9-4979-b292-aa69dc5f9bc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.827542 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70ce086-a2f9-4979-b292-aa69dc5f9bc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.827745 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70ce086-a2f9-4979-b292-aa69dc5f9bc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.828474 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.830211 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.830743 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.830889 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.831525 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.833448 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxvkl"] Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.906817 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.906936 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrnxf\" (UniqueName: \"kubernetes.io/projected/631e7d55-5830-4e5c-9ca0-65029b5b30af-kube-api-access-lrnxf\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:21 crc kubenswrapper[4728]: I0125 06:05:21.906994 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.008054 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.008133 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrnxf\" (UniqueName: \"kubernetes.io/projected/631e7d55-5830-4e5c-9ca0-65029b5b30af-kube-api-access-lrnxf\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.008176 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.014923 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc 
kubenswrapper[4728]: I0125 06:05:22.015406 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.023572 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrnxf\" (UniqueName: \"kubernetes.io/projected/631e7d55-5830-4e5c-9ca0-65029b5b30af-kube-api-access-lrnxf\") pod \"ssh-known-hosts-edpm-deployment-kxvkl\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.148961 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.614019 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kxvkl"] Jan 25 06:05:22 crc kubenswrapper[4728]: I0125 06:05:22.779920 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" event={"ID":"631e7d55-5830-4e5c-9ca0-65029b5b30af","Type":"ContainerStarted","Data":"48c301e5aa5cad1dca0624679bc0e33e9cd826ae9ee0b90846fbdd3b59799223"} Jan 25 06:05:23 crc kubenswrapper[4728]: I0125 06:05:23.789437 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" event={"ID":"631e7d55-5830-4e5c-9ca0-65029b5b30af","Type":"ContainerStarted","Data":"a0e347108cb53ce3802b6605c101ca0d8ed0dae9e98662c61304c311fa9d6a7b"} Jan 25 06:05:23 crc kubenswrapper[4728]: I0125 06:05:23.806296 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" podStartSLOduration=2.097199355 podStartE2EDuration="2.806278823s" podCreationTimestamp="2026-01-25 06:05:21 +0000 UTC" firstStartedPulling="2026-01-25 06:05:22.616816744 +0000 UTC m=+1613.652694725" lastFinishedPulling="2026-01-25 06:05:23.325896213 +0000 UTC m=+1614.361774193" observedRunningTime="2026-01-25 06:05:23.800258893 +0000 UTC m=+1614.836136863" watchObservedRunningTime="2026-01-25 06:05:23.806278823 +0000 UTC m=+1614.842156803" Jan 25 06:05:28 crc kubenswrapper[4728]: I0125 06:05:28.830676 4728 generic.go:334] "Generic (PLEG): container finished" podID="631e7d55-5830-4e5c-9ca0-65029b5b30af" containerID="a0e347108cb53ce3802b6605c101ca0d8ed0dae9e98662c61304c311fa9d6a7b" exitCode=0 Jan 25 06:05:28 crc kubenswrapper[4728]: I0125 06:05:28.830774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" event={"ID":"631e7d55-5830-4e5c-9ca0-65029b5b30af","Type":"ContainerDied","Data":"a0e347108cb53ce3802b6605c101ca0d8ed0dae9e98662c61304c311fa9d6a7b"} Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.173914 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.262044 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-inventory-0\") pod \"631e7d55-5830-4e5c-9ca0-65029b5b30af\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.262114 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-ssh-key-openstack-edpm-ipam\") pod \"631e7d55-5830-4e5c-9ca0-65029b5b30af\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.262259 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrnxf\" (UniqueName: \"kubernetes.io/projected/631e7d55-5830-4e5c-9ca0-65029b5b30af-kube-api-access-lrnxf\") pod \"631e7d55-5830-4e5c-9ca0-65029b5b30af\" (UID: \"631e7d55-5830-4e5c-9ca0-65029b5b30af\") " Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.268548 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631e7d55-5830-4e5c-9ca0-65029b5b30af-kube-api-access-lrnxf" (OuterVolumeSpecName: "kube-api-access-lrnxf") pod "631e7d55-5830-4e5c-9ca0-65029b5b30af" (UID: "631e7d55-5830-4e5c-9ca0-65029b5b30af"). InnerVolumeSpecName "kube-api-access-lrnxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.287206 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "631e7d55-5830-4e5c-9ca0-65029b5b30af" (UID: "631e7d55-5830-4e5c-9ca0-65029b5b30af"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.287807 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "631e7d55-5830-4e5c-9ca0-65029b5b30af" (UID: "631e7d55-5830-4e5c-9ca0-65029b5b30af"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.364853 4728 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.364883 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/631e7d55-5830-4e5c-9ca0-65029b5b30af-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.364896 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrnxf\" (UniqueName: \"kubernetes.io/projected/631e7d55-5830-4e5c-9ca0-65029b5b30af-kube-api-access-lrnxf\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.848425 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" event={"ID":"631e7d55-5830-4e5c-9ca0-65029b5b30af","Type":"ContainerDied","Data":"48c301e5aa5cad1dca0624679bc0e33e9cd826ae9ee0b90846fbdd3b59799223"} Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.848475 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c301e5aa5cad1dca0624679bc0e33e9cd826ae9ee0b90846fbdd3b59799223" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.848525 
4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kxvkl" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.905539 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd"] Jan 25 06:05:30 crc kubenswrapper[4728]: E0125 06:05:30.905998 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631e7d55-5830-4e5c-9ca0-65029b5b30af" containerName="ssh-known-hosts-edpm-deployment" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.906020 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="631e7d55-5830-4e5c-9ca0-65029b5b30af" containerName="ssh-known-hosts-edpm-deployment" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.906251 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="631e7d55-5830-4e5c-9ca0-65029b5b30af" containerName="ssh-known-hosts-edpm-deployment" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.906974 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.908620 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.908737 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.909710 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.910247 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.914536 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd"] Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.974762 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.975376 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4r8\" (UniqueName: \"kubernetes.io/projected/568f0cc6-4228-4797-b3af-aa2e43b30c83-kube-api-access-qb4r8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:30 crc kubenswrapper[4728]: I0125 06:05:30.975567 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.077421 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.077572 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4r8\" (UniqueName: \"kubernetes.io/projected/568f0cc6-4228-4797-b3af-aa2e43b30c83-kube-api-access-qb4r8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.077660 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.083047 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.085054 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.091199 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4r8\" (UniqueName: \"kubernetes.io/projected/568f0cc6-4228-4797-b3af-aa2e43b30c83-kube-api-access-qb4r8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-shzgd\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.224001 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.678765 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd"] Jan 25 06:05:31 crc kubenswrapper[4728]: I0125 06:05:31.856531 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" event={"ID":"568f0cc6-4228-4797-b3af-aa2e43b30c83","Type":"ContainerStarted","Data":"59e1dacf6b01ecaf79aee8dd245d08d0bd2aac53fb1cc008f811ddccc2064b85"} Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.726742 4728 scope.go:117] "RemoveContainer" containerID="500f560ec3f40f191a19e50ca98e8c3ffc055aa51f434859f9c351444f7f3edc" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.744965 4728 scope.go:117] "RemoveContainer" containerID="b1bd0c270fc8ad76a9b0d0904d0f0d261f0d39f58bfc0bc1be3adf8b2548d686" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.780874 4728 scope.go:117] "RemoveContainer" containerID="1105474da8029154b0e2d922ce0acf9a4715799696e38890160e1a9bdcc93af9" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.820613 4728 scope.go:117] "RemoveContainer" containerID="1249916ce16c9d7ed67ca807e9a4d754240ae61e4decc5e91ccd83963528b1ac" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.846202 4728 scope.go:117] "RemoveContainer" containerID="71c95086baeef57765b0e9204bbd38d6bb79ab580e92ae66a96331630c9baeb2" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.867690 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" event={"ID":"568f0cc6-4228-4797-b3af-aa2e43b30c83","Type":"ContainerStarted","Data":"30bd3a14c66b2671dd34896ca0638d6fc424a6dd3fbf217ae4cf04f5d1c6010f"} Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.889485 4728 scope.go:117] "RemoveContainer" 
containerID="2ded8a485fbb83fb007a59ec779a1bd8cb40c14224ec013eca052af6927fdb08" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.912781 4728 scope.go:117] "RemoveContainer" containerID="e1956bf2c5c63b58012e403c54c3d0afeeca580ba94af54e2770a7b7cb79ea2a" Jan 25 06:05:32 crc kubenswrapper[4728]: I0125 06:05:32.935497 4728 scope.go:117] "RemoveContainer" containerID="6b6ea53435ef08771550d35acdc38583da5bb60b7bc6538c81902756543cfe99" Jan 25 06:05:38 crc kubenswrapper[4728]: I0125 06:05:38.030423 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" podStartSLOduration=7.490412157 podStartE2EDuration="8.030400499s" podCreationTimestamp="2026-01-25 06:05:30 +0000 UTC" firstStartedPulling="2026-01-25 06:05:31.681856661 +0000 UTC m=+1622.717734642" lastFinishedPulling="2026-01-25 06:05:32.221845004 +0000 UTC m=+1623.257722984" observedRunningTime="2026-01-25 06:05:32.889381315 +0000 UTC m=+1623.925259305" watchObservedRunningTime="2026-01-25 06:05:38.030400499 +0000 UTC m=+1629.066278479" Jan 25 06:05:38 crc kubenswrapper[4728]: I0125 06:05:38.037549 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zztbh"] Jan 25 06:05:38 crc kubenswrapper[4728]: I0125 06:05:38.043601 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zztbh"] Jan 25 06:05:38 crc kubenswrapper[4728]: I0125 06:05:38.920998 4728 generic.go:334] "Generic (PLEG): container finished" podID="568f0cc6-4228-4797-b3af-aa2e43b30c83" containerID="30bd3a14c66b2671dd34896ca0638d6fc424a6dd3fbf217ae4cf04f5d1c6010f" exitCode=0 Jan 25 06:05:38 crc kubenswrapper[4728]: I0125 06:05:38.921123 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" 
event={"ID":"568f0cc6-4228-4797-b3af-aa2e43b30c83","Type":"ContainerDied","Data":"30bd3a14c66b2671dd34896ca0638d6fc424a6dd3fbf217ae4cf04f5d1c6010f"} Jan 25 06:05:39 crc kubenswrapper[4728]: I0125 06:05:39.339559 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5da755b-2c82-4436-ab58-bc22b7888ae4" path="/var/lib/kubelet/pods/f5da755b-2c82-4436-ab58-bc22b7888ae4/volumes" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.258760 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.359502 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-ssh-key-openstack-edpm-ipam\") pod \"568f0cc6-4228-4797-b3af-aa2e43b30c83\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.359677 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-inventory\") pod \"568f0cc6-4228-4797-b3af-aa2e43b30c83\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.359725 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4r8\" (UniqueName: \"kubernetes.io/projected/568f0cc6-4228-4797-b3af-aa2e43b30c83-kube-api-access-qb4r8\") pod \"568f0cc6-4228-4797-b3af-aa2e43b30c83\" (UID: \"568f0cc6-4228-4797-b3af-aa2e43b30c83\") " Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.366621 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568f0cc6-4228-4797-b3af-aa2e43b30c83-kube-api-access-qb4r8" (OuterVolumeSpecName: "kube-api-access-qb4r8") pod 
"568f0cc6-4228-4797-b3af-aa2e43b30c83" (UID: "568f0cc6-4228-4797-b3af-aa2e43b30c83"). InnerVolumeSpecName "kube-api-access-qb4r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.386753 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "568f0cc6-4228-4797-b3af-aa2e43b30c83" (UID: "568f0cc6-4228-4797-b3af-aa2e43b30c83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.387288 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-inventory" (OuterVolumeSpecName: "inventory") pod "568f0cc6-4228-4797-b3af-aa2e43b30c83" (UID: "568f0cc6-4228-4797-b3af-aa2e43b30c83"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.463602 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.464000 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0cc6-4228-4797-b3af-aa2e43b30c83-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.464017 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4r8\" (UniqueName: \"kubernetes.io/projected/568f0cc6-4228-4797-b3af-aa2e43b30c83-kube-api-access-qb4r8\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.938583 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" event={"ID":"568f0cc6-4228-4797-b3af-aa2e43b30c83","Type":"ContainerDied","Data":"59e1dacf6b01ecaf79aee8dd245d08d0bd2aac53fb1cc008f811ddccc2064b85"} Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.938631 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59e1dacf6b01ecaf79aee8dd245d08d0bd2aac53fb1cc008f811ddccc2064b85" Jan 25 06:05:40 crc kubenswrapper[4728]: I0125 06:05:40.938799 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-shzgd" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.011314 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl"] Jan 25 06:05:41 crc kubenswrapper[4728]: E0125 06:05:41.011873 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568f0cc6-4228-4797-b3af-aa2e43b30c83" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.011895 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="568f0cc6-4228-4797-b3af-aa2e43b30c83" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.012177 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="568f0cc6-4228-4797-b3af-aa2e43b30c83" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.013090 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.014770 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.015151 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.015300 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.015682 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.028589 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl"] Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.081909 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.082281 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.082511 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zww\" (UniqueName: \"kubernetes.io/projected/291a401e-d560-4e70-b979-57f86593a3b3-kube-api-access-d9zww\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.184099 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zww\" (UniqueName: \"kubernetes.io/projected/291a401e-d560-4e70-b979-57f86593a3b3-kube-api-access-d9zww\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.184206 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.184371 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.191429 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.192120 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.200822 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zww\" (UniqueName: \"kubernetes.io/projected/291a401e-d560-4e70-b979-57f86593a3b3-kube-api-access-d9zww\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.331221 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.787080 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl"] Jan 25 06:05:41 crc kubenswrapper[4728]: I0125 06:05:41.948547 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" event={"ID":"291a401e-d560-4e70-b979-57f86593a3b3","Type":"ContainerStarted","Data":"bf02e834afd236cfaada72696605ef84074fcc556c8b2cb77057acb03924c0b5"} Jan 25 06:05:42 crc kubenswrapper[4728]: I0125 06:05:42.905691 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:05:42 crc kubenswrapper[4728]: I0125 06:05:42.906000 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:05:42 crc kubenswrapper[4728]: I0125 06:05:42.957009 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" event={"ID":"291a401e-d560-4e70-b979-57f86593a3b3","Type":"ContainerStarted","Data":"0ae768331a03723d570da6c2aa5fd18dc8f03604470680e99c26d1d4a4c92764"} Jan 25 06:05:42 crc kubenswrapper[4728]: I0125 06:05:42.973688 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" podStartSLOduration=2.401220749 podStartE2EDuration="2.973666805s" 
podCreationTimestamp="2026-01-25 06:05:40 +0000 UTC" firstStartedPulling="2026-01-25 06:05:41.791291935 +0000 UTC m=+1632.827169915" lastFinishedPulling="2026-01-25 06:05:42.363738 +0000 UTC m=+1633.399615971" observedRunningTime="2026-01-25 06:05:42.970710209 +0000 UTC m=+1634.006588189" watchObservedRunningTime="2026-01-25 06:05:42.973666805 +0000 UTC m=+1634.009544784" Jan 25 06:05:50 crc kubenswrapper[4728]: I0125 06:05:50.012517 4728 generic.go:334] "Generic (PLEG): container finished" podID="291a401e-d560-4e70-b979-57f86593a3b3" containerID="0ae768331a03723d570da6c2aa5fd18dc8f03604470680e99c26d1d4a4c92764" exitCode=0 Jan 25 06:05:50 crc kubenswrapper[4728]: I0125 06:05:50.012555 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" event={"ID":"291a401e-d560-4e70-b979-57f86593a3b3","Type":"ContainerDied","Data":"0ae768331a03723d570da6c2aa5fd18dc8f03604470680e99c26d1d4a4c92764"} Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.360865 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.486965 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-inventory\") pod \"291a401e-d560-4e70-b979-57f86593a3b3\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.487111 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-ssh-key-openstack-edpm-ipam\") pod \"291a401e-d560-4e70-b979-57f86593a3b3\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.487151 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9zww\" (UniqueName: \"kubernetes.io/projected/291a401e-d560-4e70-b979-57f86593a3b3-kube-api-access-d9zww\") pod \"291a401e-d560-4e70-b979-57f86593a3b3\" (UID: \"291a401e-d560-4e70-b979-57f86593a3b3\") " Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.492943 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291a401e-d560-4e70-b979-57f86593a3b3-kube-api-access-d9zww" (OuterVolumeSpecName: "kube-api-access-d9zww") pod "291a401e-d560-4e70-b979-57f86593a3b3" (UID: "291a401e-d560-4e70-b979-57f86593a3b3"). InnerVolumeSpecName "kube-api-access-d9zww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.510728 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-inventory" (OuterVolumeSpecName: "inventory") pod "291a401e-d560-4e70-b979-57f86593a3b3" (UID: "291a401e-d560-4e70-b979-57f86593a3b3"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.510778 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "291a401e-d560-4e70-b979-57f86593a3b3" (UID: "291a401e-d560-4e70-b979-57f86593a3b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.589830 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.589862 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/291a401e-d560-4e70-b979-57f86593a3b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:51 crc kubenswrapper[4728]: I0125 06:05:51.589877 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9zww\" (UniqueName: \"kubernetes.io/projected/291a401e-d560-4e70-b979-57f86593a3b3-kube-api-access-d9zww\") on node \"crc\" DevicePath \"\"" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.030997 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" event={"ID":"291a401e-d560-4e70-b979-57f86593a3b3","Type":"ContainerDied","Data":"bf02e834afd236cfaada72696605ef84074fcc556c8b2cb77057acb03924c0b5"} Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.031050 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf02e834afd236cfaada72696605ef84074fcc556c8b2cb77057acb03924c0b5" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 
06:05:52.031058 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.094126 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd"] Jan 25 06:05:52 crc kubenswrapper[4728]: E0125 06:05:52.094834 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291a401e-d560-4e70-b979-57f86593a3b3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.094860 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="291a401e-d560-4e70-b979-57f86593a3b3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.095068 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="291a401e-d560-4e70-b979-57f86593a3b3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.095850 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.098560 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.098612 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.098645 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.098672 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 
06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.098918 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.098988 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099081 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099106 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099133 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wrpn\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-kube-api-access-5wrpn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099279 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099388 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099441 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099489 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099520 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.099677 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.100025 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.100128 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.101646 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.101665 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.101864 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.101932 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.102033 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.105703 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd"] Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.202840 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203265 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203311 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wrpn\" (UniqueName: 
\"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-kube-api-access-5wrpn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203484 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203533 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203573 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203605 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203653 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203743 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203778 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203822 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.203861 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.204085 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.204154 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.209331 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.209891 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.209994 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.210231 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.210574 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 
06:05:52.211160 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.211346 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.211908 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.212581 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.213594 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.213699 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.214212 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.214465 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.218607 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wrpn\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-kube-api-access-5wrpn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-49vxd\" 
(UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.411941 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:05:52 crc kubenswrapper[4728]: I0125 06:05:52.882741 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd"] Jan 25 06:05:53 crc kubenswrapper[4728]: I0125 06:05:53.040170 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" event={"ID":"c7e1052e-86fc-4070-be06-23fef77216d8","Type":"ContainerStarted","Data":"210e435efce7b5ddba1f91d39e911407179d446c4baa7d914e9f4b2f75972733"} Jan 25 06:05:54 crc kubenswrapper[4728]: I0125 06:05:54.048519 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" event={"ID":"c7e1052e-86fc-4070-be06-23fef77216d8","Type":"ContainerStarted","Data":"60943dfc42ad47d562b0051532f4aaedb3726ed79eb252a4ebce6ce28e0efc3b"} Jan 25 06:05:54 crc kubenswrapper[4728]: I0125 06:05:54.062125 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" podStartSLOduration=1.274791925 podStartE2EDuration="2.062108759s" podCreationTimestamp="2026-01-25 06:05:52 +0000 UTC" firstStartedPulling="2026-01-25 06:05:52.890018713 +0000 UTC m=+1643.925896693" lastFinishedPulling="2026-01-25 06:05:53.677335547 +0000 UTC m=+1644.713213527" observedRunningTime="2026-01-25 06:05:54.061648581 +0000 UTC m=+1645.097526562" watchObservedRunningTime="2026-01-25 06:05:54.062108759 +0000 UTC m=+1645.097986739" Jan 25 06:05:58 crc kubenswrapper[4728]: I0125 06:05:58.031206 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-tpsmt"] Jan 25 06:05:58 crc kubenswrapper[4728]: I0125 06:05:58.036200 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tpsmt"] Jan 25 06:05:59 crc kubenswrapper[4728]: I0125 06:05:59.025244 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxr5x"] Jan 25 06:05:59 crc kubenswrapper[4728]: I0125 06:05:59.033273 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxr5x"] Jan 25 06:05:59 crc kubenswrapper[4728]: I0125 06:05:59.338565 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4211ead7-9238-4898-a53a-ba17b0495bb3" path="/var/lib/kubelet/pods/4211ead7-9238-4898-a53a-ba17b0495bb3/volumes" Jan 25 06:05:59 crc kubenswrapper[4728]: I0125 06:05:59.339281 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3928592-b152-41a4-a787-6f723fdb1839" path="/var/lib/kubelet/pods/e3928592-b152-41a4-a787-6f723fdb1839/volumes" Jan 25 06:06:12 crc kubenswrapper[4728]: I0125 06:06:12.899272 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:06:12 crc kubenswrapper[4728]: I0125 06:06:12.900784 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:06:19 crc kubenswrapper[4728]: I0125 06:06:19.223951 4728 generic.go:334] "Generic (PLEG): container finished" podID="c7e1052e-86fc-4070-be06-23fef77216d8" 
containerID="60943dfc42ad47d562b0051532f4aaedb3726ed79eb252a4ebce6ce28e0efc3b" exitCode=0 Jan 25 06:06:19 crc kubenswrapper[4728]: I0125 06:06:19.224036 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" event={"ID":"c7e1052e-86fc-4070-be06-23fef77216d8","Type":"ContainerDied","Data":"60943dfc42ad47d562b0051532f4aaedb3726ed79eb252a4ebce6ce28e0efc3b"} Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.599660 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.703452 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ovn-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.703529 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ssh-key-openstack-edpm-ipam\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.703625 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-inventory\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.703702 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.703776 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704606 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704785 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-nova-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704858 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-neutron-metadata-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704880 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-bootstrap-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704904 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wrpn\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-kube-api-access-5wrpn\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704941 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-telemetry-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.704999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.705033 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-libvirt-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.705181 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-repo-setup-combined-ca-bundle\") pod \"c7e1052e-86fc-4070-be06-23fef77216d8\" (UID: \"c7e1052e-86fc-4070-be06-23fef77216d8\") " Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.712580 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.712616 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.713542 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.713602 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-kube-api-access-5wrpn" (OuterVolumeSpecName: "kube-api-access-5wrpn") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "kube-api-access-5wrpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.713996 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.714259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.714860 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.715299 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.729778 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.729840 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.729886 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.732513 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.736872 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-inventory" (OuterVolumeSpecName: "inventory") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.738564 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c7e1052e-86fc-4070-be06-23fef77216d8" (UID: "c7e1052e-86fc-4070-be06-23fef77216d8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809860 4728 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809890 4728 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809905 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wrpn\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-kube-api-access-5wrpn\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809916 4728 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809929 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809941 4728 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809953 4728 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809978 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.809997 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.810035 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.810046 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.810056 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc kubenswrapper[4728]: I0125 06:06:20.810067 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c7e1052e-86fc-4070-be06-23fef77216d8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:20 crc 
kubenswrapper[4728]: I0125 06:06:20.810078 4728 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1052e-86fc-4070-be06-23fef77216d8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.242944 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" event={"ID":"c7e1052e-86fc-4070-be06-23fef77216d8","Type":"ContainerDied","Data":"210e435efce7b5ddba1f91d39e911407179d446c4baa7d914e9f4b2f75972733"} Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.242996 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210e435efce7b5ddba1f91d39e911407179d446c4baa7d914e9f4b2f75972733" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.243033 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-49vxd" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.327804 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s"] Jan 25 06:06:21 crc kubenswrapper[4728]: E0125 06:06:21.328404 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e1052e-86fc-4070-be06-23fef77216d8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.328455 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e1052e-86fc-4070-be06-23fef77216d8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.328930 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e1052e-86fc-4070-be06-23fef77216d8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.330202 4728 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.333654 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.334153 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.334369 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.334590 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.334802 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.348819 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s"] Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.421550 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.421633 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krkl5\" (UniqueName: \"kubernetes.io/projected/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-kube-api-access-krkl5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: 
\"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.421673 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.421694 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.421806 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.524281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.524430 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-krkl5\" (UniqueName: \"kubernetes.io/projected/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-kube-api-access-krkl5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.524489 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.524528 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.524627 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.525813 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: 
\"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.529014 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.529441 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.530261 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.543438 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krkl5\" (UniqueName: \"kubernetes.io/projected/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-kube-api-access-krkl5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tbq4s\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:21 crc kubenswrapper[4728]: I0125 06:06:21.650933 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:06:22 crc kubenswrapper[4728]: I0125 06:06:22.130865 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s"] Jan 25 06:06:22 crc kubenswrapper[4728]: I0125 06:06:22.252285 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" event={"ID":"81f0bf26-935c-4d92-aa59-8c2e22b87f2f","Type":"ContainerStarted","Data":"77ed1e96e0e7db8e33d5cff3bb3acac4cd9ea2a2528189bdc64dbd3520c37ee9"} Jan 25 06:06:23 crc kubenswrapper[4728]: I0125 06:06:23.262432 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" event={"ID":"81f0bf26-935c-4d92-aa59-8c2e22b87f2f","Type":"ContainerStarted","Data":"d2bcc8174dee446096361626cbd425f13666f3b6f30752eadcb1b816e65800dd"} Jan 25 06:06:23 crc kubenswrapper[4728]: I0125 06:06:23.281263 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" podStartSLOduration=1.695642405 podStartE2EDuration="2.281243366s" podCreationTimestamp="2026-01-25 06:06:21 +0000 UTC" firstStartedPulling="2026-01-25 06:06:22.136226791 +0000 UTC m=+1673.172104771" lastFinishedPulling="2026-01-25 06:06:22.721827752 +0000 UTC m=+1673.757705732" observedRunningTime="2026-01-25 06:06:23.280962176 +0000 UTC m=+1674.316840156" watchObservedRunningTime="2026-01-25 06:06:23.281243366 +0000 UTC m=+1674.317121346" Jan 25 06:06:33 crc kubenswrapper[4728]: I0125 06:06:33.070398 4728 scope.go:117] "RemoveContainer" containerID="8827d8f2a96d7732dea844b3c9fae235e05bcd2bc26068b2908e3ffc81948cc1" Jan 25 06:06:33 crc kubenswrapper[4728]: I0125 06:06:33.108136 4728 scope.go:117] "RemoveContainer" containerID="1310718fe65447406c8e15ada9ffccd88ed8edced5e206ef84f6b272958be2fc" Jan 25 06:06:33 crc kubenswrapper[4728]: I0125 06:06:33.143622 
4728 scope.go:117] "RemoveContainer" containerID="4fd745d14345d24411a0e9bb2f13fb4238c51c27ced5dda05b89271297428522" Jan 25 06:06:42 crc kubenswrapper[4728]: I0125 06:06:42.899897 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:06:42 crc kubenswrapper[4728]: I0125 06:06:42.900615 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:06:42 crc kubenswrapper[4728]: I0125 06:06:42.900678 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:06:42 crc kubenswrapper[4728]: I0125 06:06:42.905621 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:06:42 crc kubenswrapper[4728]: I0125 06:06:42.905705 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" gracePeriod=600 Jan 25 06:06:43 crc kubenswrapper[4728]: E0125 06:06:43.040757 4728 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.046380 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-h9fs9"] Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.052202 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-h9fs9"] Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.338693 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6804529a-198c-458a-98a3-4bcb6685b74c" path="/var/lib/kubelet/pods/6804529a-198c-458a-98a3-4bcb6685b74c/volumes" Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.440449 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" exitCode=0 Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.440541 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29"} Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.440631 4728 scope.go:117] "RemoveContainer" containerID="0a4f5085ca82c5966b307e41854c331b5cdb7a2b51d47df89db30b0d0eaf56ac" Jan 25 06:06:43 crc kubenswrapper[4728]: I0125 06:06:43.441209 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:06:43 crc kubenswrapper[4728]: E0125 06:06:43.441528 4728 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:06:57 crc kubenswrapper[4728]: I0125 06:06:57.328813 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:06:57 crc kubenswrapper[4728]: E0125 06:06:57.329730 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:07:07 crc kubenswrapper[4728]: I0125 06:07:07.621920 4728 generic.go:334] "Generic (PLEG): container finished" podID="81f0bf26-935c-4d92-aa59-8c2e22b87f2f" containerID="d2bcc8174dee446096361626cbd425f13666f3b6f30752eadcb1b816e65800dd" exitCode=0 Jan 25 06:07:07 crc kubenswrapper[4728]: I0125 06:07:07.621981 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" event={"ID":"81f0bf26-935c-4d92-aa59-8c2e22b87f2f","Type":"ContainerDied","Data":"d2bcc8174dee446096361626cbd425f13666f3b6f30752eadcb1b816e65800dd"} Jan 25 06:07:08 crc kubenswrapper[4728]: I0125 06:07:08.961484 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.099776 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-inventory\") pod \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.099846 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krkl5\" (UniqueName: \"kubernetes.io/projected/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-kube-api-access-krkl5\") pod \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.099909 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ssh-key-openstack-edpm-ipam\") pod \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.100226 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovncontroller-config-0\") pod \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.100286 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovn-combined-ca-bundle\") pod \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\" (UID: \"81f0bf26-935c-4d92-aa59-8c2e22b87f2f\") " Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.107169 4728 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "81f0bf26-935c-4d92-aa59-8c2e22b87f2f" (UID: "81f0bf26-935c-4d92-aa59-8c2e22b87f2f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.109185 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-kube-api-access-krkl5" (OuterVolumeSpecName: "kube-api-access-krkl5") pod "81f0bf26-935c-4d92-aa59-8c2e22b87f2f" (UID: "81f0bf26-935c-4d92-aa59-8c2e22b87f2f"). InnerVolumeSpecName "kube-api-access-krkl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.127425 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "81f0bf26-935c-4d92-aa59-8c2e22b87f2f" (UID: "81f0bf26-935c-4d92-aa59-8c2e22b87f2f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.129988 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-inventory" (OuterVolumeSpecName: "inventory") pod "81f0bf26-935c-4d92-aa59-8c2e22b87f2f" (UID: "81f0bf26-935c-4d92-aa59-8c2e22b87f2f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.138220 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "81f0bf26-935c-4d92-aa59-8c2e22b87f2f" (UID: "81f0bf26-935c-4d92-aa59-8c2e22b87f2f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.203422 4728 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.203466 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.203480 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.203495 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krkl5\" (UniqueName: \"kubernetes.io/projected/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-kube-api-access-krkl5\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.203511 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81f0bf26-935c-4d92-aa59-8c2e22b87f2f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.644392 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" event={"ID":"81f0bf26-935c-4d92-aa59-8c2e22b87f2f","Type":"ContainerDied","Data":"77ed1e96e0e7db8e33d5cff3bb3acac4cd9ea2a2528189bdc64dbd3520c37ee9"} Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.644936 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77ed1e96e0e7db8e33d5cff3bb3acac4cd9ea2a2528189bdc64dbd3520c37ee9" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.644479 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tbq4s" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.711918 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb"] Jan 25 06:07:09 crc kubenswrapper[4728]: E0125 06:07:09.712285 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f0bf26-935c-4d92-aa59-8c2e22b87f2f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.712303 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f0bf26-935c-4d92-aa59-8c2e22b87f2f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.712473 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f0bf26-935c-4d92-aa59-8c2e22b87f2f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.713040 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.715440 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.715482 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.715503 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.715897 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.716182 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.719745 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.721585 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb"] Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.816389 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.816630 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.816831 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.816920 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pr9\" (UniqueName: \"kubernetes.io/projected/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-kube-api-access-52pr9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.817042 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.817122 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.919854 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.920096 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.920165 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pr9\" (UniqueName: \"kubernetes.io/projected/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-kube-api-access-52pr9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.920749 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.920832 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.921464 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.924520 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.924519 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.925118 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.925486 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.925609 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:09 crc kubenswrapper[4728]: I0125 06:07:09.935658 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pr9\" (UniqueName: \"kubernetes.io/projected/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-kube-api-access-52pr9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:10 crc kubenswrapper[4728]: I0125 06:07:10.024780 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:10 crc kubenswrapper[4728]: I0125 06:07:10.506241 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb"] Jan 25 06:07:10 crc kubenswrapper[4728]: I0125 06:07:10.654942 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" event={"ID":"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4","Type":"ContainerStarted","Data":"14958a954d1fcaf80ac46c6513bd52bb52000ceb6b4151f127c3ac05250bdf59"} Jan 25 06:07:11 crc kubenswrapper[4728]: I0125 06:07:11.664338 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" event={"ID":"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4","Type":"ContainerStarted","Data":"364e137858547993cabd58802866aa2692df33d0458a8e5d6fda3d2ce0eec277"} Jan 25 06:07:11 crc kubenswrapper[4728]: I0125 06:07:11.682722 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" podStartSLOduration=2.171161468 podStartE2EDuration="2.682699443s" podCreationTimestamp="2026-01-25 06:07:09 +0000 UTC" firstStartedPulling="2026-01-25 06:07:10.514988104 +0000 UTC m=+1721.550866084" lastFinishedPulling="2026-01-25 06:07:11.026526079 +0000 UTC m=+1722.062404059" observedRunningTime="2026-01-25 06:07:11.679362651 +0000 UTC m=+1722.715240631" watchObservedRunningTime="2026-01-25 06:07:11.682699443 +0000 UTC m=+1722.718577423" Jan 25 06:07:12 crc kubenswrapper[4728]: I0125 06:07:12.330013 4728 scope.go:117] "RemoveContainer" 
containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:07:12 crc kubenswrapper[4728]: E0125 06:07:12.330604 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:07:24 crc kubenswrapper[4728]: I0125 06:07:24.329025 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:07:24 crc kubenswrapper[4728]: E0125 06:07:24.329902 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:07:33 crc kubenswrapper[4728]: I0125 06:07:33.236004 4728 scope.go:117] "RemoveContainer" containerID="78343fd1c215f9b50d40a96050efe3cac79744b37724c31f928ae1c4ff10009d" Jan 25 06:07:39 crc kubenswrapper[4728]: I0125 06:07:39.334559 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:07:39 crc kubenswrapper[4728]: E0125 06:07:39.335898 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:07:46 crc kubenswrapper[4728]: I0125 06:07:46.966096 4728 generic.go:334] "Generic (PLEG): container finished" podID="6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" containerID="364e137858547993cabd58802866aa2692df33d0458a8e5d6fda3d2ce0eec277" exitCode=0 Jan 25 06:07:46 crc kubenswrapper[4728]: I0125 06:07:46.966492 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" event={"ID":"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4","Type":"ContainerDied","Data":"364e137858547993cabd58802866aa2692df33d0458a8e5d6fda3d2ce0eec277"} Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.287158 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.341385 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-ssh-key-openstack-edpm-ipam\") pod \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.341474 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.341618 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-metadata-combined-ca-bundle\") pod \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.341644 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-inventory\") pod \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.341686 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52pr9\" (UniqueName: \"kubernetes.io/projected/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-kube-api-access-52pr9\") pod \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.341709 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-nova-metadata-neutron-config-0\") pod \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\" (UID: \"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4\") " Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.352359 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" (UID: "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.352413 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-kube-api-access-52pr9" (OuterVolumeSpecName: "kube-api-access-52pr9") pod "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" (UID: "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4"). InnerVolumeSpecName "kube-api-access-52pr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.367333 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" (UID: "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.367672 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" (UID: "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.368805 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-inventory" (OuterVolumeSpecName: "inventory") pod "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" (UID: "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.369283 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" (UID: "6f8bee4e-2d23-4efc-81cc-e82bb4466eb4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.445077 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.445123 4728 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.445142 4728 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.445154 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.445165 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52pr9\" (UniqueName: \"kubernetes.io/projected/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-kube-api-access-52pr9\") on node \"crc\" DevicePath \"\"" Jan 25 
06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.445179 4728 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6f8bee4e-2d23-4efc-81cc-e82bb4466eb4-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.985035 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" event={"ID":"6f8bee4e-2d23-4efc-81cc-e82bb4466eb4","Type":"ContainerDied","Data":"14958a954d1fcaf80ac46c6513bd52bb52000ceb6b4151f127c3ac05250bdf59"} Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.985090 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14958a954d1fcaf80ac46c6513bd52bb52000ceb6b4151f127c3ac05250bdf59" Jan 25 06:07:48 crc kubenswrapper[4728]: I0125 06:07:48.985109 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.105545 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd"] Jan 25 06:07:49 crc kubenswrapper[4728]: E0125 06:07:49.106376 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.106405 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.106721 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f8bee4e-2d23-4efc-81cc-e82bb4466eb4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 25 06:07:49 crc 
kubenswrapper[4728]: I0125 06:07:49.107810 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.110253 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.110347 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.110364 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.110514 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.111213 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.114808 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd"] Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.260465 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.260536 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.260620 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.260654 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.260773 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcckc\" (UniqueName: \"kubernetes.io/projected/99aea3d5-d496-457b-87b9-95c444db3c76-kube-api-access-gcckc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.363655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcckc\" (UniqueName: \"kubernetes.io/projected/99aea3d5-d496-457b-87b9-95c444db3c76-kube-api-access-gcckc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.364039 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.364082 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.364170 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.364205 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.370638 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.370761 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.371143 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.373512 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.384640 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcckc\" (UniqueName: \"kubernetes.io/projected/99aea3d5-d496-457b-87b9-95c444db3c76-kube-api-access-gcckc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 
06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.426596 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:07:49 crc kubenswrapper[4728]: I0125 06:07:49.901205 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd"] Jan 25 06:07:50 crc kubenswrapper[4728]: I0125 06:07:50.010456 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" event={"ID":"99aea3d5-d496-457b-87b9-95c444db3c76","Type":"ContainerStarted","Data":"251def0137ce5b270321e7cb211fdac999d2bf12b1f7b9c071c61d5d7160f11f"} Jan 25 06:07:51 crc kubenswrapper[4728]: I0125 06:07:51.021316 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" event={"ID":"99aea3d5-d496-457b-87b9-95c444db3c76","Type":"ContainerStarted","Data":"3227b7f81911e1f6e8c24c223ad9ac8d8920ffd8bac2f54183ed0c03a470c599"} Jan 25 06:07:51 crc kubenswrapper[4728]: I0125 06:07:51.042288 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" podStartSLOduration=1.440392948 podStartE2EDuration="2.042267956s" podCreationTimestamp="2026-01-25 06:07:49 +0000 UTC" firstStartedPulling="2026-01-25 06:07:49.905396518 +0000 UTC m=+1760.941274499" lastFinishedPulling="2026-01-25 06:07:50.507271526 +0000 UTC m=+1761.543149507" observedRunningTime="2026-01-25 06:07:51.035717463 +0000 UTC m=+1762.071595443" watchObservedRunningTime="2026-01-25 06:07:51.042267956 +0000 UTC m=+1762.078145935" Jan 25 06:07:53 crc kubenswrapper[4728]: I0125 06:07:53.328985 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:07:53 crc kubenswrapper[4728]: E0125 06:07:53.329570 4728 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:08:06 crc kubenswrapper[4728]: I0125 06:08:06.329085 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:08:06 crc kubenswrapper[4728]: E0125 06:08:06.329900 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:08:18 crc kubenswrapper[4728]: I0125 06:08:18.329610 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:08:18 crc kubenswrapper[4728]: E0125 06:08:18.330658 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:08:31 crc kubenswrapper[4728]: I0125 06:08:31.328937 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:08:31 crc kubenswrapper[4728]: E0125 06:08:31.329543 4728 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:08:42 crc kubenswrapper[4728]: I0125 06:08:42.328769 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:08:42 crc kubenswrapper[4728]: E0125 06:08:42.329885 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:08:57 crc kubenswrapper[4728]: I0125 06:08:57.329435 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:08:57 crc kubenswrapper[4728]: E0125 06:08:57.330416 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:09:09 crc kubenswrapper[4728]: I0125 06:09:09.334407 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:09:09 crc kubenswrapper[4728]: E0125 06:09:09.335164 4728 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:09:23 crc kubenswrapper[4728]: I0125 06:09:23.329116 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:09:23 crc kubenswrapper[4728]: E0125 06:09:23.329955 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:09:36 crc kubenswrapper[4728]: I0125 06:09:36.328525 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:09:36 crc kubenswrapper[4728]: E0125 06:09:36.329459 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:09:51 crc kubenswrapper[4728]: I0125 06:09:51.329462 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:09:51 crc kubenswrapper[4728]: E0125 
06:09:51.330364 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:10:04 crc kubenswrapper[4728]: I0125 06:10:04.329505 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:10:04 crc kubenswrapper[4728]: E0125 06:10:04.330510 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:10:15 crc kubenswrapper[4728]: I0125 06:10:15.329210 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:10:15 crc kubenswrapper[4728]: E0125 06:10:15.330227 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:10:30 crc kubenswrapper[4728]: I0125 06:10:30.329313 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:10:30 crc 
kubenswrapper[4728]: E0125 06:10:30.330500 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:10:37 crc kubenswrapper[4728]: I0125 06:10:37.343862 4728 generic.go:334] "Generic (PLEG): container finished" podID="99aea3d5-d496-457b-87b9-95c444db3c76" containerID="3227b7f81911e1f6e8c24c223ad9ac8d8920ffd8bac2f54183ed0c03a470c599" exitCode=0 Jan 25 06:10:37 crc kubenswrapper[4728]: I0125 06:10:37.343952 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" event={"ID":"99aea3d5-d496-457b-87b9-95c444db3c76","Type":"ContainerDied","Data":"3227b7f81911e1f6e8c24c223ad9ac8d8920ffd8bac2f54183ed0c03a470c599"} Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.695059 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.842933 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-ssh-key-openstack-edpm-ipam\") pod \"99aea3d5-d496-457b-87b9-95c444db3c76\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.843221 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-inventory\") pod \"99aea3d5-d496-457b-87b9-95c444db3c76\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.843448 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcckc\" (UniqueName: \"kubernetes.io/projected/99aea3d5-d496-457b-87b9-95c444db3c76-kube-api-access-gcckc\") pod \"99aea3d5-d496-457b-87b9-95c444db3c76\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.843554 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-secret-0\") pod \"99aea3d5-d496-457b-87b9-95c444db3c76\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.843637 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-combined-ca-bundle\") pod \"99aea3d5-d496-457b-87b9-95c444db3c76\" (UID: \"99aea3d5-d496-457b-87b9-95c444db3c76\") " Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.850282 4728 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "99aea3d5-d496-457b-87b9-95c444db3c76" (UID: "99aea3d5-d496-457b-87b9-95c444db3c76"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.850410 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99aea3d5-d496-457b-87b9-95c444db3c76-kube-api-access-gcckc" (OuterVolumeSpecName: "kube-api-access-gcckc") pod "99aea3d5-d496-457b-87b9-95c444db3c76" (UID: "99aea3d5-d496-457b-87b9-95c444db3c76"). InnerVolumeSpecName "kube-api-access-gcckc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.868436 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "99aea3d5-d496-457b-87b9-95c444db3c76" (UID: "99aea3d5-d496-457b-87b9-95c444db3c76"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.869199 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-inventory" (OuterVolumeSpecName: "inventory") pod "99aea3d5-d496-457b-87b9-95c444db3c76" (UID: "99aea3d5-d496-457b-87b9-95c444db3c76"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.870913 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "99aea3d5-d496-457b-87b9-95c444db3c76" (UID: "99aea3d5-d496-457b-87b9-95c444db3c76"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.946115 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.946417 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.946432 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcckc\" (UniqueName: \"kubernetes.io/projected/99aea3d5-d496-457b-87b9-95c444db3c76-kube-api-access-gcckc\") on node \"crc\" DevicePath \"\"" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.946448 4728 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:10:38 crc kubenswrapper[4728]: I0125 06:10:38.946458 4728 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aea3d5-d496-457b-87b9-95c444db3c76-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.362657 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" event={"ID":"99aea3d5-d496-457b-87b9-95c444db3c76","Type":"ContainerDied","Data":"251def0137ce5b270321e7cb211fdac999d2bf12b1f7b9c071c61d5d7160f11f"} Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.362706 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251def0137ce5b270321e7cb211fdac999d2bf12b1f7b9c071c61d5d7160f11f" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.362736 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.452623 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds"] Jan 25 06:10:39 crc kubenswrapper[4728]: E0125 06:10:39.453146 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99aea3d5-d496-457b-87b9-95c444db3c76" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.453162 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="99aea3d5-d496-457b-87b9-95c444db3c76" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.453388 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="99aea3d5-d496-457b-87b9-95c444db3c76" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.454092 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455506 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlb45\" (UniqueName: \"kubernetes.io/projected/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-kube-api-access-xlb45\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455547 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455618 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455646 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 
06:10:39.455685 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455778 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455823 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455851 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.455948 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.457004 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.457472 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.457521 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.457567 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.457478 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.458046 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.458646 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.470359 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds"] Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.557665 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: 
\"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.557726 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558134 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558304 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558357 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558426 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558511 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlb45\" (UniqueName: \"kubernetes.io/projected/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-kube-api-access-xlb45\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.558539 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.559434 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.563752 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.564241 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.564358 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.564683 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.564816 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.565987 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.565997 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.576175 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlb45\" (UniqueName: \"kubernetes.io/projected/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-kube-api-access-xlb45\") pod \"nova-edpm-deployment-openstack-edpm-ipam-prsds\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:39 crc kubenswrapper[4728]: I0125 06:10:39.773079 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:10:40 crc kubenswrapper[4728]: I0125 06:10:40.241872 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 06:10:40 crc kubenswrapper[4728]: I0125 06:10:40.241877 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds"] Jan 25 06:10:40 crc kubenswrapper[4728]: I0125 06:10:40.392146 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" event={"ID":"6b1b4c44-e390-4a48-aa8a-84f5509ef99e","Type":"ContainerStarted","Data":"b866706876fa144fa35ed3fc46460c3e1562f69898b123f932d132f17a11e359"} Jan 25 06:10:41 crc kubenswrapper[4728]: I0125 06:10:41.401454 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" event={"ID":"6b1b4c44-e390-4a48-aa8a-84f5509ef99e","Type":"ContainerStarted","Data":"dc2ba5e6c04c7f7b8e8acb98a2fe68652fdaf14fe6dbb3a52583cbffb3efece9"} Jan 25 06:10:41 crc kubenswrapper[4728]: I0125 06:10:41.420352 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" podStartSLOduration=1.88124351 podStartE2EDuration="2.420300064s" podCreationTimestamp="2026-01-25 06:10:39 +0000 UTC" firstStartedPulling="2026-01-25 06:10:40.241634658 +0000 UTC m=+1931.277512638" lastFinishedPulling="2026-01-25 06:10:40.780691213 +0000 UTC m=+1931.816569192" observedRunningTime="2026-01-25 06:10:41.419913194 +0000 UTC m=+1932.455791174" watchObservedRunningTime="2026-01-25 06:10:41.420300064 +0000 UTC m=+1932.456178043" Jan 25 06:10:42 crc kubenswrapper[4728]: I0125 06:10:42.329049 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:10:42 crc kubenswrapper[4728]: E0125 06:10:42.329790 4728 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:10:55 crc kubenswrapper[4728]: I0125 06:10:55.329122 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:10:55 crc kubenswrapper[4728]: E0125 06:10:55.330229 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:11:07 crc kubenswrapper[4728]: I0125 06:11:07.330289 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:11:07 crc kubenswrapper[4728]: E0125 06:11:07.331623 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:11:18 crc kubenswrapper[4728]: I0125 06:11:18.329134 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:11:18 crc kubenswrapper[4728]: E0125 
06:11:18.329961 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:11:31 crc kubenswrapper[4728]: I0125 06:11:31.328730 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:11:31 crc kubenswrapper[4728]: E0125 06:11:31.329613 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:11:45 crc kubenswrapper[4728]: I0125 06:11:45.329734 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:11:45 crc kubenswrapper[4728]: I0125 06:11:45.965966 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"5d65eb3ab7b3191ef5a90da016917172abb4b15bea97bca8dcfe0198c9442bb6"} Jan 25 06:12:19 crc kubenswrapper[4728]: I0125 06:12:19.277505 4728 generic.go:334] "Generic (PLEG): container finished" podID="6b1b4c44-e390-4a48-aa8a-84f5509ef99e" containerID="dc2ba5e6c04c7f7b8e8acb98a2fe68652fdaf14fe6dbb3a52583cbffb3efece9" exitCode=0 Jan 25 06:12:19 crc kubenswrapper[4728]: I0125 06:12:19.277616 4728 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" event={"ID":"6b1b4c44-e390-4a48-aa8a-84f5509ef99e","Type":"ContainerDied","Data":"dc2ba5e6c04c7f7b8e8acb98a2fe68652fdaf14fe6dbb3a52583cbffb3efece9"} Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.671350 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780568 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-1\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780650 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-combined-ca-bundle\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780823 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-1\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780868 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlb45\" (UniqueName: \"kubernetes.io/projected/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-kube-api-access-xlb45\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780894 4728 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-0\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780934 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-0\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.780995 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-inventory\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.781123 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-extra-config-0\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.781167 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-ssh-key-openstack-edpm-ipam\") pod \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\" (UID: \"6b1b4c44-e390-4a48-aa8a-84f5509ef99e\") " Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.787088 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-kube-api-access-xlb45" 
(OuterVolumeSpecName: "kube-api-access-xlb45") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "kube-api-access-xlb45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.787852 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.806588 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.809506 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.810031 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.810407 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.810570 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.819544 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.820644 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-inventory" (OuterVolumeSpecName: "inventory") pod "6b1b4c44-e390-4a48-aa8a-84f5509ef99e" (UID: "6b1b4c44-e390-4a48-aa8a-84f5509ef99e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885175 4728 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885204 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlb45\" (UniqueName: \"kubernetes.io/projected/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-kube-api-access-xlb45\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885215 4728 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885225 4728 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885237 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885251 4728 reconciler_common.go:293] "Volume detached for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885260 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885286 4728 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:20 crc kubenswrapper[4728]: I0125 06:12:20.885296 4728 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1b4c44-e390-4a48-aa8a-84f5509ef99e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.298946 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" event={"ID":"6b1b4c44-e390-4a48-aa8a-84f5509ef99e","Type":"ContainerDied","Data":"b866706876fa144fa35ed3fc46460c3e1562f69898b123f932d132f17a11e359"} Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.299476 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b866706876fa144fa35ed3fc46460c3e1562f69898b123f932d132f17a11e359" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.299027 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-prsds" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.408112 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st"] Jan 25 06:12:21 crc kubenswrapper[4728]: E0125 06:12:21.408587 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1b4c44-e390-4a48-aa8a-84f5509ef99e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.408607 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1b4c44-e390-4a48-aa8a-84f5509ef99e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.408805 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1b4c44-e390-4a48-aa8a-84f5509ef99e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.409587 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.414118 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-brzxc" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.414338 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.414366 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.417116 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.417354 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.417780 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st"] Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.495768 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.495812 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: 
\"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.495846 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.495908 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.495942 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.495968 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p9xv\" (UniqueName: \"kubernetes.io/projected/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-kube-api-access-6p9xv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.496052 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597729 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597790 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597833 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597856 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597877 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p9xv\" (UniqueName: \"kubernetes.io/projected/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-kube-api-access-6p9xv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.597924 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.603385 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.603517 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.603770 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.604383 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.605334 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.605315 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.615047 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p9xv\" (UniqueName: \"kubernetes.io/projected/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-kube-api-access-6p9xv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bw6st\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:21 crc kubenswrapper[4728]: I0125 06:12:21.723795 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:12:22 crc kubenswrapper[4728]: I0125 06:12:22.198869 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st"] Jan 25 06:12:22 crc kubenswrapper[4728]: W0125 06:12:22.201449 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46d9d0ef_f0c0_45c4_8497_5cca3ea0ff76.slice/crio-0323b5a720579d08ba4bb7cbf944a3816909828fcadf5b4ab013a15eeb111c28 WatchSource:0}: Error finding container 0323b5a720579d08ba4bb7cbf944a3816909828fcadf5b4ab013a15eeb111c28: Status 404 returned error can't find the container with id 0323b5a720579d08ba4bb7cbf944a3816909828fcadf5b4ab013a15eeb111c28 Jan 25 06:12:22 crc kubenswrapper[4728]: I0125 06:12:22.317210 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" event={"ID":"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76","Type":"ContainerStarted","Data":"0323b5a720579d08ba4bb7cbf944a3816909828fcadf5b4ab013a15eeb111c28"} Jan 25 06:12:23 crc kubenswrapper[4728]: I0125 06:12:23.337751 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" event={"ID":"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76","Type":"ContainerStarted","Data":"b79dca4c7f7293f7bd404f2254f02d6ce7013e7a89e9c637a428488dded1ecaa"} Jan 25 06:12:23 crc kubenswrapper[4728]: I0125 06:12:23.356891 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" podStartSLOduration=1.899518698 podStartE2EDuration="2.356874291s" podCreationTimestamp="2026-01-25 06:12:21 +0000 UTC" firstStartedPulling="2026-01-25 06:12:22.204413154 +0000 UTC m=+2033.240291125" lastFinishedPulling="2026-01-25 06:12:22.661768739 +0000 UTC m=+2033.697646718" 
observedRunningTime="2026-01-25 06:12:23.347440774 +0000 UTC m=+2034.383318754" watchObservedRunningTime="2026-01-25 06:12:23.356874291 +0000 UTC m=+2034.392752271" Jan 25 06:13:12 crc kubenswrapper[4728]: I0125 06:13:12.932776 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7z4mm"] Jan 25 06:13:12 crc kubenswrapper[4728]: I0125 06:13:12.935586 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:12 crc kubenswrapper[4728]: I0125 06:13:12.948672 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7z4mm"] Jan 25 06:13:12 crc kubenswrapper[4728]: I0125 06:13:12.980872 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-utilities\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:12 crc kubenswrapper[4728]: I0125 06:13:12.980916 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrps\" (UniqueName: \"kubernetes.io/projected/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-kube-api-access-7mrps\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:12 crc kubenswrapper[4728]: I0125 06:13:12.980960 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-catalog-content\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.082850 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-utilities\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.082895 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrps\" (UniqueName: \"kubernetes.io/projected/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-kube-api-access-7mrps\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.082947 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-catalog-content\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.083392 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-utilities\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.083460 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-catalog-content\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.099479 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7mrps\" (UniqueName: \"kubernetes.io/projected/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-kube-api-access-7mrps\") pod \"redhat-operators-7z4mm\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.252215 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.633886 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7z4mm"] Jan 25 06:13:13 crc kubenswrapper[4728]: I0125 06:13:13.767801 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerStarted","Data":"45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6"} Jan 25 06:13:14 crc kubenswrapper[4728]: I0125 06:13:14.778256 4728 generic.go:334] "Generic (PLEG): container finished" podID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerID="6f946105e350bb40e75e87434bc4347fbd54a0efb9b1edf651c117dab6bfaeb6" exitCode=0 Jan 25 06:13:14 crc kubenswrapper[4728]: I0125 06:13:14.778400 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerDied","Data":"6f946105e350bb40e75e87434bc4347fbd54a0efb9b1edf651c117dab6bfaeb6"} Jan 25 06:13:15 crc kubenswrapper[4728]: I0125 06:13:15.789110 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerStarted","Data":"e00f974b1dfe4c4e68c7b66d4a506e81cc846d54aae41cc80df900c94daea25e"} Jan 25 06:13:17 crc kubenswrapper[4728]: I0125 06:13:17.806733 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerID="e00f974b1dfe4c4e68c7b66d4a506e81cc846d54aae41cc80df900c94daea25e" exitCode=0 Jan 25 06:13:17 crc kubenswrapper[4728]: I0125 06:13:17.806791 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerDied","Data":"e00f974b1dfe4c4e68c7b66d4a506e81cc846d54aae41cc80df900c94daea25e"} Jan 25 06:13:18 crc kubenswrapper[4728]: I0125 06:13:18.819958 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerStarted","Data":"1a25df4a0250255df4f626707b7677267466d124a50031774a5f339905b03ae2"} Jan 25 06:13:18 crc kubenswrapper[4728]: I0125 06:13:18.844868 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7z4mm" podStartSLOduration=3.324769179 podStartE2EDuration="6.844844716s" podCreationTimestamp="2026-01-25 06:13:12 +0000 UTC" firstStartedPulling="2026-01-25 06:13:14.780475599 +0000 UTC m=+2085.816353570" lastFinishedPulling="2026-01-25 06:13:18.300551126 +0000 UTC m=+2089.336429107" observedRunningTime="2026-01-25 06:13:18.835487032 +0000 UTC m=+2089.871365011" watchObservedRunningTime="2026-01-25 06:13:18.844844716 +0000 UTC m=+2089.880722695" Jan 25 06:13:23 crc kubenswrapper[4728]: I0125 06:13:23.252936 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:23 crc kubenswrapper[4728]: I0125 06:13:23.253447 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:23 crc kubenswrapper[4728]: I0125 06:13:23.290968 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:23 crc 
kubenswrapper[4728]: I0125 06:13:23.911433 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.524980 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7z4mm"] Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.526034 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7z4mm" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="registry-server" containerID="cri-o://1a25df4a0250255df4f626707b7677267466d124a50031774a5f339905b03ae2" gracePeriod=2 Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.914635 4728 generic.go:334] "Generic (PLEG): container finished" podID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerID="1a25df4a0250255df4f626707b7677267466d124a50031774a5f339905b03ae2" exitCode=0 Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.914718 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerDied","Data":"1a25df4a0250255df4f626707b7677267466d124a50031774a5f339905b03ae2"} Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.914915 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7z4mm" event={"ID":"c610ba74-3bb5-4d4b-9001-bcdb70979dbc","Type":"ContainerDied","Data":"45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6"} Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.914934 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6" Jan 25 06:13:28 crc kubenswrapper[4728]: I0125 06:13:28.917938 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.020866 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mrps\" (UniqueName: \"kubernetes.io/projected/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-kube-api-access-7mrps\") pod \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.020972 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-catalog-content\") pod \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.021104 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-utilities\") pod \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\" (UID: \"c610ba74-3bb5-4d4b-9001-bcdb70979dbc\") " Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.021914 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-utilities" (OuterVolumeSpecName: "utilities") pod "c610ba74-3bb5-4d4b-9001-bcdb70979dbc" (UID: "c610ba74-3bb5-4d4b-9001-bcdb70979dbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.028261 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-kube-api-access-7mrps" (OuterVolumeSpecName: "kube-api-access-7mrps") pod "c610ba74-3bb5-4d4b-9001-bcdb70979dbc" (UID: "c610ba74-3bb5-4d4b-9001-bcdb70979dbc"). InnerVolumeSpecName "kube-api-access-7mrps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.117461 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c610ba74-3bb5-4d4b-9001-bcdb70979dbc" (UID: "c610ba74-3bb5-4d4b-9001-bcdb70979dbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.123033 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.123060 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mrps\" (UniqueName: \"kubernetes.io/projected/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-kube-api-access-7mrps\") on node \"crc\" DevicePath \"\"" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.123074 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c610ba74-3bb5-4d4b-9001-bcdb70979dbc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.922068 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7z4mm" Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.948054 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7z4mm"] Jan 25 06:13:29 crc kubenswrapper[4728]: I0125 06:13:29.953997 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7z4mm"] Jan 25 06:13:30 crc kubenswrapper[4728]: I0125 06:13:30.933222 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hhv7p"] Jan 25 06:13:30 crc kubenswrapper[4728]: E0125 06:13:30.933921 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="extract-content" Jan 25 06:13:30 crc kubenswrapper[4728]: I0125 06:13:30.933936 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="extract-content" Jan 25 06:13:30 crc kubenswrapper[4728]: E0125 06:13:30.933953 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="extract-utilities" Jan 25 06:13:30 crc kubenswrapper[4728]: I0125 06:13:30.933959 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="extract-utilities" Jan 25 06:13:30 crc kubenswrapper[4728]: E0125 06:13:30.933971 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="registry-server" Jan 25 06:13:30 crc kubenswrapper[4728]: I0125 06:13:30.933976 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="registry-server" Jan 25 06:13:30 crc kubenswrapper[4728]: I0125 06:13:30.934194 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" containerName="registry-server" Jan 25 06:13:30 
crc kubenswrapper[4728]: I0125 06:13:30.935641 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:30 crc kubenswrapper[4728]: I0125 06:13:30.949331 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhv7p"] Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.071188 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-catalog-content\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.071240 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cscjh\" (UniqueName: \"kubernetes.io/projected/139877d7-358c-4806-89e3-94e2cc5ef1c3-kube-api-access-cscjh\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.071264 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-utilities\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.172943 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-catalog-content\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 
06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.172996 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cscjh\" (UniqueName: \"kubernetes.io/projected/139877d7-358c-4806-89e3-94e2cc5ef1c3-kube-api-access-cscjh\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.173019 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-utilities\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.173476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-catalog-content\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.173512 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-utilities\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.189924 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cscjh\" (UniqueName: \"kubernetes.io/projected/139877d7-358c-4806-89e3-94e2cc5ef1c3-kube-api-access-cscjh\") pod \"redhat-marketplace-hhv7p\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 
06:13:31.253751 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.337763 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c610ba74-3bb5-4d4b-9001-bcdb70979dbc" path="/var/lib/kubelet/pods/c610ba74-3bb5-4d4b-9001-bcdb70979dbc/volumes" Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.702447 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhv7p"] Jan 25 06:13:31 crc kubenswrapper[4728]: W0125 06:13:31.705136 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139877d7_358c_4806_89e3_94e2cc5ef1c3.slice/crio-02413cd30838b906f11674f165941033d9e125dab6b3d8be4b1c2777ce3216b5 WatchSource:0}: Error finding container 02413cd30838b906f11674f165941033d9e125dab6b3d8be4b1c2777ce3216b5: Status 404 returned error can't find the container with id 02413cd30838b906f11674f165941033d9e125dab6b3d8be4b1c2777ce3216b5 Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.941486 4728 generic.go:334] "Generic (PLEG): container finished" podID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerID="1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc" exitCode=0 Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.941554 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhv7p" event={"ID":"139877d7-358c-4806-89e3-94e2cc5ef1c3","Type":"ContainerDied","Data":"1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc"} Jan 25 06:13:31 crc kubenswrapper[4728]: I0125 06:13:31.941617 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhv7p" 
event={"ID":"139877d7-358c-4806-89e3-94e2cc5ef1c3","Type":"ContainerStarted","Data":"02413cd30838b906f11674f165941033d9e125dab6b3d8be4b1c2777ce3216b5"} Jan 25 06:13:32 crc kubenswrapper[4728]: I0125 06:13:32.951736 4728 generic.go:334] "Generic (PLEG): container finished" podID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerID="98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c" exitCode=0 Jan 25 06:13:32 crc kubenswrapper[4728]: I0125 06:13:32.951910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhv7p" event={"ID":"139877d7-358c-4806-89e3-94e2cc5ef1c3","Type":"ContainerDied","Data":"98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c"} Jan 25 06:13:33 crc kubenswrapper[4728]: I0125 06:13:33.963278 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhv7p" event={"ID":"139877d7-358c-4806-89e3-94e2cc5ef1c3","Type":"ContainerStarted","Data":"6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a"} Jan 25 06:13:34 crc kubenswrapper[4728]: I0125 06:13:34.002126 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hhv7p" podStartSLOduration=2.471966721 podStartE2EDuration="4.002109302s" podCreationTimestamp="2026-01-25 06:13:30 +0000 UTC" firstStartedPulling="2026-01-25 06:13:31.943550039 +0000 UTC m=+2102.979428019" lastFinishedPulling="2026-01-25 06:13:33.473692619 +0000 UTC m=+2104.509570600" observedRunningTime="2026-01-25 06:13:33.994288206 +0000 UTC m=+2105.030166186" watchObservedRunningTime="2026-01-25 06:13:34.002109302 +0000 UTC m=+2105.037987282" Jan 25 06:13:34 crc kubenswrapper[4728]: E0125 06:13:34.483098 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice/crio-45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice\": RecentStats: unable to find data in memory cache]" Jan 25 06:13:41 crc kubenswrapper[4728]: I0125 06:13:41.254066 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:41 crc kubenswrapper[4728]: I0125 06:13:41.254719 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:41 crc kubenswrapper[4728]: I0125 06:13:41.292012 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:42 crc kubenswrapper[4728]: I0125 06:13:42.072803 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:42 crc kubenswrapper[4728]: I0125 06:13:42.125957 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhv7p"] Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.053603 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hhv7p" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="registry-server" containerID="cri-o://6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a" gracePeriod=2 Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.458539 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.488986 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-catalog-content\") pod \"139877d7-358c-4806-89e3-94e2cc5ef1c3\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.489202 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cscjh\" (UniqueName: \"kubernetes.io/projected/139877d7-358c-4806-89e3-94e2cc5ef1c3-kube-api-access-cscjh\") pod \"139877d7-358c-4806-89e3-94e2cc5ef1c3\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.489247 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-utilities\") pod \"139877d7-358c-4806-89e3-94e2cc5ef1c3\" (UID: \"139877d7-358c-4806-89e3-94e2cc5ef1c3\") " Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.490465 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-utilities" (OuterVolumeSpecName: "utilities") pod "139877d7-358c-4806-89e3-94e2cc5ef1c3" (UID: "139877d7-358c-4806-89e3-94e2cc5ef1c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.506077 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139877d7-358c-4806-89e3-94e2cc5ef1c3-kube-api-access-cscjh" (OuterVolumeSpecName: "kube-api-access-cscjh") pod "139877d7-358c-4806-89e3-94e2cc5ef1c3" (UID: "139877d7-358c-4806-89e3-94e2cc5ef1c3"). InnerVolumeSpecName "kube-api-access-cscjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.507235 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "139877d7-358c-4806-89e3-94e2cc5ef1c3" (UID: "139877d7-358c-4806-89e3-94e2cc5ef1c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.591858 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.591889 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139877d7-358c-4806-89e3-94e2cc5ef1c3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:13:44 crc kubenswrapper[4728]: I0125 06:13:44.591901 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cscjh\" (UniqueName: \"kubernetes.io/projected/139877d7-358c-4806-89e3-94e2cc5ef1c3-kube-api-access-cscjh\") on node \"crc\" DevicePath \"\"" Jan 25 06:13:44 crc kubenswrapper[4728]: E0125 06:13:44.705614 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice/crio-45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6\": RecentStats: unable to find data in memory cache]" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.064914 4728 generic.go:334] "Generic (PLEG): container 
finished" podID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerID="6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a" exitCode=0 Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.064988 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhv7p" event={"ID":"139877d7-358c-4806-89e3-94e2cc5ef1c3","Type":"ContainerDied","Data":"6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a"} Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.065393 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhv7p" event={"ID":"139877d7-358c-4806-89e3-94e2cc5ef1c3","Type":"ContainerDied","Data":"02413cd30838b906f11674f165941033d9e125dab6b3d8be4b1c2777ce3216b5"} Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.065459 4728 scope.go:117] "RemoveContainer" containerID="6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.065006 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhv7p" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.097854 4728 scope.go:117] "RemoveContainer" containerID="98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.109556 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhv7p"] Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.118306 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhv7p"] Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.120598 4728 scope.go:117] "RemoveContainer" containerID="1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.153311 4728 scope.go:117] "RemoveContainer" containerID="6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a" Jan 25 06:13:45 crc kubenswrapper[4728]: E0125 06:13:45.153770 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a\": container with ID starting with 6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a not found: ID does not exist" containerID="6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.153802 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a"} err="failed to get container status \"6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a\": rpc error: code = NotFound desc = could not find container \"6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a\": container with ID starting with 6b7ec690bd81e49aebd0a0dab659e73af6e9bfd86f67d29f2fc86db563816c7a not found: 
ID does not exist" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.153825 4728 scope.go:117] "RemoveContainer" containerID="98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c" Jan 25 06:13:45 crc kubenswrapper[4728]: E0125 06:13:45.154056 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c\": container with ID starting with 98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c not found: ID does not exist" containerID="98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.154076 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c"} err="failed to get container status \"98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c\": rpc error: code = NotFound desc = could not find container \"98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c\": container with ID starting with 98cb477f338b8d8d54c63dd9d3c822e82e1ff68e1b0a6b8fd0bae97b3bf4f49c not found: ID does not exist" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.154089 4728 scope.go:117] "RemoveContainer" containerID="1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc" Jan 25 06:13:45 crc kubenswrapper[4728]: E0125 06:13:45.154297 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc\": container with ID starting with 1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc not found: ID does not exist" containerID="1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.154315 4728 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc"} err="failed to get container status \"1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc\": rpc error: code = NotFound desc = could not find container \"1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc\": container with ID starting with 1e51202c66d2c688c929bbf71b5b7c246bed5e2fb4bfb7aba153200520da36bc not found: ID does not exist" Jan 25 06:13:45 crc kubenswrapper[4728]: I0125 06:13:45.339385 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" path="/var/lib/kubelet/pods/139877d7-358c-4806-89e3-94e2cc5ef1c3/volumes" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.935032 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qkm8z"] Jan 25 06:13:46 crc kubenswrapper[4728]: E0125 06:13:46.935696 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="extract-content" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.935712 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="extract-content" Jan 25 06:13:46 crc kubenswrapper[4728]: E0125 06:13:46.935721 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="registry-server" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.935727 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="registry-server" Jan 25 06:13:46 crc kubenswrapper[4728]: E0125 06:13:46.935735 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="extract-utilities" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.935740 4728 
state_mem.go:107] "Deleted CPUSet assignment" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="extract-utilities" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.935946 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="139877d7-358c-4806-89e3-94e2cc5ef1c3" containerName="registry-server" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.937175 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.939112 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkdr\" (UniqueName: \"kubernetes.io/projected/d625c185-a245-42b6-9de7-16382b34c8c2-kube-api-access-8qkdr\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.939167 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-catalog-content\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.939221 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-utilities\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:46 crc kubenswrapper[4728]: I0125 06:13:46.944778 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qkm8z"] Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 
06:13:47.040778 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkdr\" (UniqueName: \"kubernetes.io/projected/d625c185-a245-42b6-9de7-16382b34c8c2-kube-api-access-8qkdr\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.040836 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-catalog-content\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.040888 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-utilities\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.041386 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-utilities\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.041611 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-catalog-content\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.059429 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkdr\" (UniqueName: \"kubernetes.io/projected/d625c185-a245-42b6-9de7-16382b34c8c2-kube-api-access-8qkdr\") pod \"certified-operators-qkm8z\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.259354 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:47 crc kubenswrapper[4728]: I0125 06:13:47.823796 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qkm8z"] Jan 25 06:13:47 crc kubenswrapper[4728]: W0125 06:13:47.828281 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd625c185_a245_42b6_9de7_16382b34c8c2.slice/crio-73de05f334182d01754b92550213fa5f2adda596898e161c3dd4e5e83da1ab7b WatchSource:0}: Error finding container 73de05f334182d01754b92550213fa5f2adda596898e161c3dd4e5e83da1ab7b: Status 404 returned error can't find the container with id 73de05f334182d01754b92550213fa5f2adda596898e161c3dd4e5e83da1ab7b Jan 25 06:13:48 crc kubenswrapper[4728]: I0125 06:13:48.093165 4728 generic.go:334] "Generic (PLEG): container finished" podID="d625c185-a245-42b6-9de7-16382b34c8c2" containerID="38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83" exitCode=0 Jan 25 06:13:48 crc kubenswrapper[4728]: I0125 06:13:48.093342 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerDied","Data":"38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83"} Jan 25 06:13:48 crc kubenswrapper[4728]: I0125 06:13:48.093612 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" 
event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerStarted","Data":"73de05f334182d01754b92550213fa5f2adda596898e161c3dd4e5e83da1ab7b"} Jan 25 06:13:49 crc kubenswrapper[4728]: I0125 06:13:49.104984 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerStarted","Data":"0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f"} Jan 25 06:13:49 crc kubenswrapper[4728]: I0125 06:13:49.931525 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q7ldz"] Jan 25 06:13:49 crc kubenswrapper[4728]: I0125 06:13:49.934417 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:49 crc kubenswrapper[4728]: I0125 06:13:49.943648 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7ldz"] Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.004509 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10bfcaa-828b-444b-948b-1063ce7b114f-utilities\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.004632 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10bfcaa-828b-444b-948b-1063ce7b114f-catalog-content\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.004677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w6745\" (UniqueName: \"kubernetes.io/projected/a10bfcaa-828b-444b-948b-1063ce7b114f-kube-api-access-w6745\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.106585 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10bfcaa-828b-444b-948b-1063ce7b114f-utilities\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.106709 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10bfcaa-828b-444b-948b-1063ce7b114f-catalog-content\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.106778 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6745\" (UniqueName: \"kubernetes.io/projected/a10bfcaa-828b-444b-948b-1063ce7b114f-kube-api-access-w6745\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.107116 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10bfcaa-828b-444b-948b-1063ce7b114f-utilities\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.107511 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/a10bfcaa-828b-444b-948b-1063ce7b114f-catalog-content\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.112892 4728 generic.go:334] "Generic (PLEG): container finished" podID="d625c185-a245-42b6-9de7-16382b34c8c2" containerID="0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f" exitCode=0 Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.112940 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerDied","Data":"0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f"} Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.128340 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6745\" (UniqueName: \"kubernetes.io/projected/a10bfcaa-828b-444b-948b-1063ce7b114f-kube-api-access-w6745\") pod \"community-operators-q7ldz\" (UID: \"a10bfcaa-828b-444b-948b-1063ce7b114f\") " pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.254146 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:13:50 crc kubenswrapper[4728]: I0125 06:13:50.704152 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7ldz"] Jan 25 06:13:50 crc kubenswrapper[4728]: W0125 06:13:50.706999 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda10bfcaa_828b_444b_948b_1063ce7b114f.slice/crio-81e4fb221a8e1e572d6f1536aab7c223e9c096f720be3a8f7291506f8c03ad20 WatchSource:0}: Error finding container 81e4fb221a8e1e572d6f1536aab7c223e9c096f720be3a8f7291506f8c03ad20: Status 404 returned error can't find the container with id 81e4fb221a8e1e572d6f1536aab7c223e9c096f720be3a8f7291506f8c03ad20 Jan 25 06:13:51 crc kubenswrapper[4728]: I0125 06:13:51.132281 4728 generic.go:334] "Generic (PLEG): container finished" podID="a10bfcaa-828b-444b-948b-1063ce7b114f" containerID="1ffe29aff32975f761702544814fc54c7854e5a793d4a3609be4e77e18aa3405" exitCode=0 Jan 25 06:13:51 crc kubenswrapper[4728]: I0125 06:13:51.132389 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7ldz" event={"ID":"a10bfcaa-828b-444b-948b-1063ce7b114f","Type":"ContainerDied","Data":"1ffe29aff32975f761702544814fc54c7854e5a793d4a3609be4e77e18aa3405"} Jan 25 06:13:51 crc kubenswrapper[4728]: I0125 06:13:51.132447 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7ldz" event={"ID":"a10bfcaa-828b-444b-948b-1063ce7b114f","Type":"ContainerStarted","Data":"81e4fb221a8e1e572d6f1536aab7c223e9c096f720be3a8f7291506f8c03ad20"} Jan 25 06:13:51 crc kubenswrapper[4728]: I0125 06:13:51.140773 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" 
event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerStarted","Data":"6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1"} Jan 25 06:13:51 crc kubenswrapper[4728]: I0125 06:13:51.182036 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qkm8z" podStartSLOduration=2.5620453100000002 podStartE2EDuration="5.182008746s" podCreationTimestamp="2026-01-25 06:13:46 +0000 UTC" firstStartedPulling="2026-01-25 06:13:48.09545298 +0000 UTC m=+2119.131330959" lastFinishedPulling="2026-01-25 06:13:50.715416415 +0000 UTC m=+2121.751294395" observedRunningTime="2026-01-25 06:13:51.172050469 +0000 UTC m=+2122.207928449" watchObservedRunningTime="2026-01-25 06:13:51.182008746 +0000 UTC m=+2122.217886716" Jan 25 06:13:54 crc kubenswrapper[4728]: E0125 06:13:54.924269 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice/crio-45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6\": RecentStats: unable to find data in memory cache]" Jan 25 06:13:55 crc kubenswrapper[4728]: I0125 06:13:55.176981 4728 generic.go:334] "Generic (PLEG): container finished" podID="a10bfcaa-828b-444b-948b-1063ce7b114f" containerID="a6183b20a16feb3f1f5c6dba203147dc1a27d5c79d976cd280eff2602cf26c7f" exitCode=0 Jan 25 06:13:55 crc kubenswrapper[4728]: I0125 06:13:55.177038 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7ldz" event={"ID":"a10bfcaa-828b-444b-948b-1063ce7b114f","Type":"ContainerDied","Data":"a6183b20a16feb3f1f5c6dba203147dc1a27d5c79d976cd280eff2602cf26c7f"} Jan 25 06:13:56 crc kubenswrapper[4728]: I0125 
06:13:56.186350 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7ldz" event={"ID":"a10bfcaa-828b-444b-948b-1063ce7b114f","Type":"ContainerStarted","Data":"3c74121f3c9cac0ce2f7031c9fb00c9b41c9f491edc28ab5763c24556f1fb442"} Jan 25 06:13:56 crc kubenswrapper[4728]: I0125 06:13:56.212875 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q7ldz" podStartSLOduration=2.656360658 podStartE2EDuration="7.212859146s" podCreationTimestamp="2026-01-25 06:13:49 +0000 UTC" firstStartedPulling="2026-01-25 06:13:51.136147303 +0000 UTC m=+2122.172025282" lastFinishedPulling="2026-01-25 06:13:55.69264579 +0000 UTC m=+2126.728523770" observedRunningTime="2026-01-25 06:13:56.204852488 +0000 UTC m=+2127.240730468" watchObservedRunningTime="2026-01-25 06:13:56.212859146 +0000 UTC m=+2127.248737125" Jan 25 06:13:57 crc kubenswrapper[4728]: I0125 06:13:57.260114 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:57 crc kubenswrapper[4728]: I0125 06:13:57.260586 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:57 crc kubenswrapper[4728]: I0125 06:13:57.302807 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:58 crc kubenswrapper[4728]: I0125 06:13:58.243022 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:13:58 crc kubenswrapper[4728]: I0125 06:13:58.925807 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qkm8z"] Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.221648 4728 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-qkm8z" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="registry-server" containerID="cri-o://6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1" gracePeriod=2 Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.254386 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.254433 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.288718 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.683910 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.776360 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qkdr\" (UniqueName: \"kubernetes.io/projected/d625c185-a245-42b6-9de7-16382b34c8c2-kube-api-access-8qkdr\") pod \"d625c185-a245-42b6-9de7-16382b34c8c2\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.776555 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-utilities\") pod \"d625c185-a245-42b6-9de7-16382b34c8c2\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.776602 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-catalog-content\") pod 
\"d625c185-a245-42b6-9de7-16382b34c8c2\" (UID: \"d625c185-a245-42b6-9de7-16382b34c8c2\") " Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.777126 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-utilities" (OuterVolumeSpecName: "utilities") pod "d625c185-a245-42b6-9de7-16382b34c8c2" (UID: "d625c185-a245-42b6-9de7-16382b34c8c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.777768 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.782230 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d625c185-a245-42b6-9de7-16382b34c8c2-kube-api-access-8qkdr" (OuterVolumeSpecName: "kube-api-access-8qkdr") pod "d625c185-a245-42b6-9de7-16382b34c8c2" (UID: "d625c185-a245-42b6-9de7-16382b34c8c2"). InnerVolumeSpecName "kube-api-access-8qkdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.812472 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d625c185-a245-42b6-9de7-16382b34c8c2" (UID: "d625c185-a245-42b6-9de7-16382b34c8c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.879279 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d625c185-a245-42b6-9de7-16382b34c8c2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:00 crc kubenswrapper[4728]: I0125 06:14:00.879311 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qkdr\" (UniqueName: \"kubernetes.io/projected/d625c185-a245-42b6-9de7-16382b34c8c2-kube-api-access-8qkdr\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.232901 4728 generic.go:334] "Generic (PLEG): container finished" podID="d625c185-a245-42b6-9de7-16382b34c8c2" containerID="6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1" exitCode=0 Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.233487 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerDied","Data":"6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1"} Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.233582 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qkm8z" event={"ID":"d625c185-a245-42b6-9de7-16382b34c8c2","Type":"ContainerDied","Data":"73de05f334182d01754b92550213fa5f2adda596898e161c3dd4e5e83da1ab7b"} Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.233584 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qkm8z" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.233610 4728 scope.go:117] "RemoveContainer" containerID="6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.277046 4728 scope.go:117] "RemoveContainer" containerID="0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.279519 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qkm8z"] Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.285596 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qkm8z"] Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.290904 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q7ldz" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.314518 4728 scope.go:117] "RemoveContainer" containerID="38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.338263 4728 scope.go:117] "RemoveContainer" containerID="6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1" Jan 25 06:14:01 crc kubenswrapper[4728]: E0125 06:14:01.338559 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1\": container with ID starting with 6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1 not found: ID does not exist" containerID="6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.338578 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" 
path="/var/lib/kubelet/pods/d625c185-a245-42b6-9de7-16382b34c8c2/volumes" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.338598 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1"} err="failed to get container status \"6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1\": rpc error: code = NotFound desc = could not find container \"6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1\": container with ID starting with 6af5d4dbec815910b4e4dc9d7afb146ebc14892e546cf7473c8234ffecf0e6a1 not found: ID does not exist" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.338624 4728 scope.go:117] "RemoveContainer" containerID="0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f" Jan 25 06:14:01 crc kubenswrapper[4728]: E0125 06:14:01.338941 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f\": container with ID starting with 0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f not found: ID does not exist" containerID="0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.338963 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f"} err="failed to get container status \"0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f\": rpc error: code = NotFound desc = could not find container \"0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f\": container with ID starting with 0a4a823e8927f97e0d2174ba4d750c6245975de5c97190ad8d4ee76ec6d8596f not found: ID does not exist" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.338980 4728 scope.go:117] 
"RemoveContainer" containerID="38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83" Jan 25 06:14:01 crc kubenswrapper[4728]: E0125 06:14:01.339231 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83\": container with ID starting with 38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83 not found: ID does not exist" containerID="38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83" Jan 25 06:14:01 crc kubenswrapper[4728]: I0125 06:14:01.339256 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83"} err="failed to get container status \"38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83\": rpc error: code = NotFound desc = could not find container \"38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83\": container with ID starting with 38688509a6b3cf9e2bfd4a21fe0faaae458636dacd94365f663a728aa08e0f83 not found: ID does not exist" Jan 25 06:14:02 crc kubenswrapper[4728]: I0125 06:14:02.351870 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7ldz"] Jan 25 06:14:02 crc kubenswrapper[4728]: I0125 06:14:02.727340 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b255c"] Jan 25 06:14:02 crc kubenswrapper[4728]: I0125 06:14:02.727676 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b255c" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="registry-server" containerID="cri-o://0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755" gracePeriod=2 Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.255575 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b255c" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.257271 4728 generic.go:334] "Generic (PLEG): container finished" podID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerID="0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755" exitCode=0 Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.257359 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b255c" event={"ID":"8b2f959d-89c4-43df-98a5-b8c37490dff7","Type":"ContainerDied","Data":"0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755"} Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.257423 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b255c" event={"ID":"8b2f959d-89c4-43df-98a5-b8c37490dff7","Type":"ContainerDied","Data":"64099e9c7eecee19c1a17622b89187559713cbe3f1879864d8e2e984ca0f567f"} Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.257447 4728 scope.go:117] "RemoveContainer" containerID="0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.277911 4728 scope.go:117] "RemoveContainer" containerID="477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.305518 4728 scope.go:117] "RemoveContainer" containerID="c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.328353 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-catalog-content\") pod \"8b2f959d-89c4-43df-98a5-b8c37490dff7\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.328406 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-utilities\") pod \"8b2f959d-89c4-43df-98a5-b8c37490dff7\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.328444 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87gpn\" (UniqueName: \"kubernetes.io/projected/8b2f959d-89c4-43df-98a5-b8c37490dff7-kube-api-access-87gpn\") pod \"8b2f959d-89c4-43df-98a5-b8c37490dff7\" (UID: \"8b2f959d-89c4-43df-98a5-b8c37490dff7\") " Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.328940 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-utilities" (OuterVolumeSpecName: "utilities") pod "8b2f959d-89c4-43df-98a5-b8c37490dff7" (UID: "8b2f959d-89c4-43df-98a5-b8c37490dff7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.335521 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2f959d-89c4-43df-98a5-b8c37490dff7-kube-api-access-87gpn" (OuterVolumeSpecName: "kube-api-access-87gpn") pod "8b2f959d-89c4-43df-98a5-b8c37490dff7" (UID: "8b2f959d-89c4-43df-98a5-b8c37490dff7"). InnerVolumeSpecName "kube-api-access-87gpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.352956 4728 scope.go:117] "RemoveContainer" containerID="0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755" Jan 25 06:14:03 crc kubenswrapper[4728]: E0125 06:14:03.355410 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755\": container with ID starting with 0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755 not found: ID does not exist" containerID="0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.355461 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755"} err="failed to get container status \"0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755\": rpc error: code = NotFound desc = could not find container \"0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755\": container with ID starting with 0ce8cead89bdb62ed4fa67db97121594b6f666dfdebed11af134d23833335755 not found: ID does not exist" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.355502 4728 scope.go:117] "RemoveContainer" containerID="477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4" Jan 25 06:14:03 crc kubenswrapper[4728]: E0125 06:14:03.356099 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4\": container with ID starting with 477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4 not found: ID does not exist" containerID="477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.356146 
4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4"} err="failed to get container status \"477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4\": rpc error: code = NotFound desc = could not find container \"477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4\": container with ID starting with 477a03d631f46f5d339b910371f3f70e77de8646e2e8b15c8022662c65d589a4 not found: ID does not exist" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.356178 4728 scope.go:117] "RemoveContainer" containerID="c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa" Jan 25 06:14:03 crc kubenswrapper[4728]: E0125 06:14:03.356566 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa\": container with ID starting with c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa not found: ID does not exist" containerID="c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.356609 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa"} err="failed to get container status \"c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa\": rpc error: code = NotFound desc = could not find container \"c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa\": container with ID starting with c0895e37e5362e170e7a24b1216516710ca249dae316533ff8120139c629c7fa not found: ID does not exist" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.372529 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "8b2f959d-89c4-43df-98a5-b8c37490dff7" (UID: "8b2f959d-89c4-43df-98a5-b8c37490dff7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.432108 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.432137 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b2f959d-89c4-43df-98a5-b8c37490dff7-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:03 crc kubenswrapper[4728]: I0125 06:14:03.432146 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87gpn\" (UniqueName: \"kubernetes.io/projected/8b2f959d-89c4-43df-98a5-b8c37490dff7-kube-api-access-87gpn\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:04 crc kubenswrapper[4728]: I0125 06:14:04.265390 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b255c" Jan 25 06:14:04 crc kubenswrapper[4728]: I0125 06:14:04.295557 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b255c"] Jan 25 06:14:04 crc kubenswrapper[4728]: I0125 06:14:04.302359 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b255c"] Jan 25 06:14:05 crc kubenswrapper[4728]: E0125 06:14:05.146696 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice/crio-45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice\": RecentStats: unable to find data in memory cache]" Jan 25 06:14:05 crc kubenswrapper[4728]: I0125 06:14:05.337596 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" path="/var/lib/kubelet/pods/8b2f959d-89c4-43df-98a5-b8c37490dff7/volumes" Jan 25 06:14:08 crc kubenswrapper[4728]: I0125 06:14:08.298357 4728 generic.go:334] "Generic (PLEG): container finished" podID="46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" containerID="b79dca4c7f7293f7bd404f2254f02d6ce7013e7a89e9c637a428488dded1ecaa" exitCode=0 Jan 25 06:14:08 crc kubenswrapper[4728]: I0125 06:14:08.298435 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" event={"ID":"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76","Type":"ContainerDied","Data":"b79dca4c7f7293f7bd404f2254f02d6ce7013e7a89e9c637a428488dded1ecaa"} Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.706014 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.768256 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-inventory\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.768392 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ssh-key-openstack-edpm-ipam\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.768598 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.768726 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-2\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.768797 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p9xv\" (UniqueName: \"kubernetes.io/projected/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-kube-api-access-6p9xv\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 
06:14:09.768898 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-telemetry-combined-ca-bundle\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.768937 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-0\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.775032 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.775044 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-kube-api-access-6p9xv" (OuterVolumeSpecName: "kube-api-access-6p9xv") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "kube-api-access-6p9xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:14:09 crc kubenswrapper[4728]: E0125 06:14:09.793814 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1 podName:46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76 nodeName:}" failed. 
No retries permitted until 2026-01-25 06:14:10.293786038 +0000 UTC m=+2141.329664019 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ceilometer-compute-config-data-1" (UniqueName: "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76") : error deleting /var/lib/kubelet/pods/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76/volume-subpaths: remove /var/lib/kubelet/pods/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76/volume-subpaths: no such file or directory Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.795642 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-inventory" (OuterVolumeSpecName: "inventory") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.796293 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.796777 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.796867 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.873845 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.874079 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p9xv\" (UniqueName: \"kubernetes.io/projected/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-kube-api-access-6p9xv\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.874153 4728 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.874209 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:09 crc kubenswrapper[4728]: I0125 06:14:09.874269 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-inventory\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:09 crc 
kubenswrapper[4728]: I0125 06:14:09.874418 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:10 crc kubenswrapper[4728]: I0125 06:14:10.316473 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" event={"ID":"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76","Type":"ContainerDied","Data":"0323b5a720579d08ba4bb7cbf944a3816909828fcadf5b4ab013a15eeb111c28"} Jan 25 06:14:10 crc kubenswrapper[4728]: I0125 06:14:10.316515 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0323b5a720579d08ba4bb7cbf944a3816909828fcadf5b4ab013a15eeb111c28" Jan 25 06:14:10 crc kubenswrapper[4728]: I0125 06:14:10.316515 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bw6st" Jan 25 06:14:10 crc kubenswrapper[4728]: I0125 06:14:10.387549 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1\") pod \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\" (UID: \"46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76\") " Jan 25 06:14:10 crc kubenswrapper[4728]: I0125 06:14:10.391216 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" (UID: "46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:14:10 crc kubenswrapper[4728]: I0125 06:14:10.491301 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 25 06:14:12 crc kubenswrapper[4728]: I0125 06:14:12.899269 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:14:12 crc kubenswrapper[4728]: I0125 06:14:12.899755 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:14:15 crc kubenswrapper[4728]: E0125 06:14:15.372640 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice/crio-45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice\": RecentStats: unable to find data in memory cache]" Jan 25 06:14:25 crc kubenswrapper[4728]: E0125 06:14:25.566232 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice/crio-45075ff7bd1b28d9c5282583b434f5764da656b9471e405588c4364f127d48a6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc610ba74_3bb5_4d4b_9001_bcdb70979dbc.slice\": RecentStats: unable to find data in memory cache]" Jan 25 06:14:42 crc kubenswrapper[4728]: I0125 06:14:42.898984 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:14:42 crc kubenswrapper[4728]: I0125 06:14:42.899466 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.143837 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc"] Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145289 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="registry-server" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145308 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="registry-server" Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145349 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="extract-content" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145357 
4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="extract-content" Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145369 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="extract-utilities" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145377 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="extract-utilities" Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145410 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="extract-utilities" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145417 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="extract-utilities" Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145434 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145442 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145458 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="registry-server" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145466 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="registry-server" Jan 25 06:15:00 crc kubenswrapper[4728]: E0125 06:15:00.145476 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="extract-content" Jan 25 
06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145483 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="extract-content" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145722 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145745 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d625c185-a245-42b6-9de7-16382b34c8c2" containerName="registry-server" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.145762 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2f959d-89c4-43df-98a5-b8c37490dff7" containerName="registry-server" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.146649 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.149100 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.149445 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.168700 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc"] Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.291876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b61f9c-934f-47c6-bd8c-7967c8adc48e-secret-volume\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.293194 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgrl\" (UniqueName: \"kubernetes.io/projected/28b61f9c-934f-47c6-bd8c-7967c8adc48e-kube-api-access-tfgrl\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.293807 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b61f9c-934f-47c6-bd8c-7967c8adc48e-config-volume\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.395842 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgrl\" (UniqueName: \"kubernetes.io/projected/28b61f9c-934f-47c6-bd8c-7967c8adc48e-kube-api-access-tfgrl\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.396021 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b61f9c-934f-47c6-bd8c-7967c8adc48e-config-volume\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.396064 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/28b61f9c-934f-47c6-bd8c-7967c8adc48e-secret-volume\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.397271 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b61f9c-934f-47c6-bd8c-7967c8adc48e-config-volume\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.402116 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b61f9c-934f-47c6-bd8c-7967c8adc48e-secret-volume\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.411055 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgrl\" (UniqueName: \"kubernetes.io/projected/28b61f9c-934f-47c6-bd8c-7967c8adc48e-kube-api-access-tfgrl\") pod \"collect-profiles-29488695-sgmlc\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.479546 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:00 crc kubenswrapper[4728]: I0125 06:15:00.885981 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc"] Jan 25 06:15:01 crc kubenswrapper[4728]: I0125 06:15:01.779594 4728 generic.go:334] "Generic (PLEG): container finished" podID="28b61f9c-934f-47c6-bd8c-7967c8adc48e" containerID="dc283aa4805019803a70d07c168fcc9c73f682a7b9f791a634f91ef8e99911e1" exitCode=0 Jan 25 06:15:01 crc kubenswrapper[4728]: I0125 06:15:01.779847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" event={"ID":"28b61f9c-934f-47c6-bd8c-7967c8adc48e","Type":"ContainerDied","Data":"dc283aa4805019803a70d07c168fcc9c73f682a7b9f791a634f91ef8e99911e1"} Jan 25 06:15:01 crc kubenswrapper[4728]: I0125 06:15:01.780135 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" event={"ID":"28b61f9c-934f-47c6-bd8c-7967c8adc48e","Type":"ContainerStarted","Data":"8262175d26e523caa4f62ded035cb377d3f9c7333b3606278c4691a64fad7569"} Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.067751 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.160076 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b61f9c-934f-47c6-bd8c-7967c8adc48e-secret-volume\") pod \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.160252 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b61f9c-934f-47c6-bd8c-7967c8adc48e-config-volume\") pod \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.160278 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfgrl\" (UniqueName: \"kubernetes.io/projected/28b61f9c-934f-47c6-bd8c-7967c8adc48e-kube-api-access-tfgrl\") pod \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\" (UID: \"28b61f9c-934f-47c6-bd8c-7967c8adc48e\") " Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.160928 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28b61f9c-934f-47c6-bd8c-7967c8adc48e-config-volume" (OuterVolumeSpecName: "config-volume") pod "28b61f9c-934f-47c6-bd8c-7967c8adc48e" (UID: "28b61f9c-934f-47c6-bd8c-7967c8adc48e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.166076 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b61f9c-934f-47c6-bd8c-7967c8adc48e-kube-api-access-tfgrl" (OuterVolumeSpecName: "kube-api-access-tfgrl") pod "28b61f9c-934f-47c6-bd8c-7967c8adc48e" (UID: "28b61f9c-934f-47c6-bd8c-7967c8adc48e"). 
InnerVolumeSpecName "kube-api-access-tfgrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.166581 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b61f9c-934f-47c6-bd8c-7967c8adc48e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28b61f9c-934f-47c6-bd8c-7967c8adc48e" (UID: "28b61f9c-934f-47c6-bd8c-7967c8adc48e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.262387 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b61f9c-934f-47c6-bd8c-7967c8adc48e-config-volume\") on node \"crc\" DevicePath \"\"" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.262427 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfgrl\" (UniqueName: \"kubernetes.io/projected/28b61f9c-934f-47c6-bd8c-7967c8adc48e-kube-api-access-tfgrl\") on node \"crc\" DevicePath \"\"" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.262441 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b61f9c-934f-47c6-bd8c-7967c8adc48e-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.800185 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" event={"ID":"28b61f9c-934f-47c6-bd8c-7967c8adc48e","Type":"ContainerDied","Data":"8262175d26e523caa4f62ded035cb377d3f9c7333b3606278c4691a64fad7569"} Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.800622 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8262175d26e523caa4f62ded035cb377d3f9c7333b3606278c4691a64fad7569" Jan 25 06:15:03 crc kubenswrapper[4728]: I0125 06:15:03.800304 4728 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488695-sgmlc" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.146778 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 25 06:15:04 crc kubenswrapper[4728]: E0125 06:15:04.147492 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b61f9c-934f-47c6-bd8c-7967c8adc48e" containerName="collect-profiles" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.147508 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b61f9c-934f-47c6-bd8c-7967c8adc48e" containerName="collect-profiles" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.147742 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b61f9c-934f-47c6-bd8c-7967c8adc48e" containerName="collect-profiles" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.148701 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.151506 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.151592 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.152427 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.153304 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n8nfh" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.159270 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs"] Jan 25 06:15:04 crc 
kubenswrapper[4728]: I0125 06:15:04.166750 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488650-h9sxs"] Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.172919 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.181892 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-config-data\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.182090 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.182127 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.285187 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.285261 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.285315 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.285422 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.285473 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-config-data\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.285553 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.286410 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.286979 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.287029 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4kg\" (UniqueName: \"kubernetes.io/projected/29c48d20-6804-4826-89f8-2b3e00949942-kube-api-access-qd4kg\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.287121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.288153 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-config-data\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.289877 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.389340 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.389506 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.389630 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.389667 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.389741 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4kg\" (UniqueName: \"kubernetes.io/projected/29c48d20-6804-4826-89f8-2b3e00949942-kube-api-access-qd4kg\") pod \"tempest-tests-tempest\" (UID: 
\"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.389841 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.390059 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.390152 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.390240 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.394432 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.394539 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.408412 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4kg\" (UniqueName: \"kubernetes.io/projected/29c48d20-6804-4826-89f8-2b3e00949942-kube-api-access-qd4kg\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.415897 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.472047 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 25 06:15:04 crc kubenswrapper[4728]: I0125 06:15:04.883698 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 25 06:15:04 crc kubenswrapper[4728]: W0125 06:15:04.888524 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c48d20_6804_4826_89f8_2b3e00949942.slice/crio-1d18f496cd9d9c2c40863f34d5d47abf382545fe90ed3692485e973d890e6610 WatchSource:0}: Error finding container 1d18f496cd9d9c2c40863f34d5d47abf382545fe90ed3692485e973d890e6610: Status 404 returned error can't find the container with id 1d18f496cd9d9c2c40863f34d5d47abf382545fe90ed3692485e973d890e6610 Jan 25 06:15:05 crc kubenswrapper[4728]: I0125 06:15:05.348213 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="322a41cf-af1e-4c7e-80e3-b7c7a32e8c68" path="/var/lib/kubelet/pods/322a41cf-af1e-4c7e-80e3-b7c7a32e8c68/volumes" Jan 25 06:15:05 crc kubenswrapper[4728]: I0125 06:15:05.822215 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"29c48d20-6804-4826-89f8-2b3e00949942","Type":"ContainerStarted","Data":"1d18f496cd9d9c2c40863f34d5d47abf382545fe90ed3692485e973d890e6610"} Jan 25 06:15:12 crc kubenswrapper[4728]: I0125 06:15:12.899114 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:15:12 crc kubenswrapper[4728]: I0125 06:15:12.899758 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:15:12 crc kubenswrapper[4728]: I0125 06:15:12.899830 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:15:12 crc kubenswrapper[4728]: I0125 06:15:12.900628 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d65eb3ab7b3191ef5a90da016917172abb4b15bea97bca8dcfe0198c9442bb6"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:15:12 crc kubenswrapper[4728]: I0125 06:15:12.900696 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://5d65eb3ab7b3191ef5a90da016917172abb4b15bea97bca8dcfe0198c9442bb6" gracePeriod=600 Jan 25 06:15:13 crc kubenswrapper[4728]: I0125 06:15:13.906392 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="5d65eb3ab7b3191ef5a90da016917172abb4b15bea97bca8dcfe0198c9442bb6" exitCode=0 Jan 25 06:15:13 crc kubenswrapper[4728]: I0125 06:15:13.906504 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"5d65eb3ab7b3191ef5a90da016917172abb4b15bea97bca8dcfe0198c9442bb6"} Jan 25 06:15:13 crc kubenswrapper[4728]: I0125 06:15:13.906713 4728 scope.go:117] "RemoveContainer" containerID="006db167bb4f986ad06a0b1dac25aba092fcbbb3667a42607f89bc31b5f37e29" Jan 25 06:15:14 crc kubenswrapper[4728]: I0125 06:15:14.916065 4728 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3"} Jan 25 06:15:19 crc kubenswrapper[4728]: I0125 06:15:19.969698 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"29c48d20-6804-4826-89f8-2b3e00949942","Type":"ContainerStarted","Data":"47b20c0f721975305db1f61e939099e278b2a284c66fd773d2db57e3fffeaf33"} Jan 25 06:15:19 crc kubenswrapper[4728]: I0125 06:15:19.988823 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=2.518477597 podStartE2EDuration="16.988802305s" podCreationTimestamp="2026-01-25 06:15:03 +0000 UTC" firstStartedPulling="2026-01-25 06:15:04.892617222 +0000 UTC m=+2195.928495202" lastFinishedPulling="2026-01-25 06:15:19.362941929 +0000 UTC m=+2210.398819910" observedRunningTime="2026-01-25 06:15:19.982948688 +0000 UTC m=+2211.018826668" watchObservedRunningTime="2026-01-25 06:15:19.988802305 +0000 UTC m=+2211.024680285" Jan 25 06:15:33 crc kubenswrapper[4728]: I0125 06:15:33.451732 4728 scope.go:117] "RemoveContainer" containerID="c9f667c9c53b8d3ea2f9be2f47a6b9a0955d6c9ee583815ebe2bc20d8a86d114" Jan 25 06:17:42 crc kubenswrapper[4728]: I0125 06:17:42.899674 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:17:42 crc kubenswrapper[4728]: I0125 06:17:42.900277 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:18:12 crc kubenswrapper[4728]: I0125 06:18:12.899556 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:18:12 crc kubenswrapper[4728]: I0125 06:18:12.900341 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:18:42 crc kubenswrapper[4728]: I0125 06:18:42.899705 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:18:42 crc kubenswrapper[4728]: I0125 06:18:42.900373 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:18:42 crc kubenswrapper[4728]: I0125 06:18:42.900492 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:18:42 crc kubenswrapper[4728]: I0125 06:18:42.901859 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:18:42 crc kubenswrapper[4728]: I0125 06:18:42.901947 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" gracePeriod=600 Jan 25 06:18:43 crc kubenswrapper[4728]: E0125 06:18:43.019437 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:18:43 crc kubenswrapper[4728]: I0125 06:18:43.757428 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" exitCode=0 Jan 25 06:18:43 crc kubenswrapper[4728]: I0125 06:18:43.757477 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3"} Jan 25 06:18:43 crc kubenswrapper[4728]: I0125 06:18:43.757524 4728 scope.go:117] "RemoveContainer" containerID="5d65eb3ab7b3191ef5a90da016917172abb4b15bea97bca8dcfe0198c9442bb6" Jan 25 06:18:43 crc kubenswrapper[4728]: I0125 06:18:43.759397 4728 
scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:18:43 crc kubenswrapper[4728]: E0125 06:18:43.759893 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:18:55 crc kubenswrapper[4728]: I0125 06:18:55.330053 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:18:55 crc kubenswrapper[4728]: E0125 06:18:55.331436 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:19:06 crc kubenswrapper[4728]: I0125 06:19:06.329699 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:19:06 crc kubenswrapper[4728]: E0125 06:19:06.330775 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:19:21 crc kubenswrapper[4728]: I0125 
06:19:21.329156 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:19:21 crc kubenswrapper[4728]: E0125 06:19:21.330169 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:19:33 crc kubenswrapper[4728]: I0125 06:19:33.538379 4728 scope.go:117] "RemoveContainer" containerID="e00f974b1dfe4c4e68c7b66d4a506e81cc846d54aae41cc80df900c94daea25e" Jan 25 06:19:33 crc kubenswrapper[4728]: I0125 06:19:33.570120 4728 scope.go:117] "RemoveContainer" containerID="6f946105e350bb40e75e87434bc4347fbd54a0efb9b1edf651c117dab6bfaeb6" Jan 25 06:19:33 crc kubenswrapper[4728]: I0125 06:19:33.596435 4728 scope.go:117] "RemoveContainer" containerID="1a25df4a0250255df4f626707b7677267466d124a50031774a5f339905b03ae2" Jan 25 06:19:35 crc kubenswrapper[4728]: I0125 06:19:35.328986 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:19:35 crc kubenswrapper[4728]: E0125 06:19:35.329522 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:19:46 crc kubenswrapper[4728]: I0125 06:19:46.328840 4728 scope.go:117] "RemoveContainer" 
containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:19:46 crc kubenswrapper[4728]: E0125 06:19:46.329718 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:20:01 crc kubenswrapper[4728]: I0125 06:20:01.328940 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:20:01 crc kubenswrapper[4728]: E0125 06:20:01.330921 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:20:12 crc kubenswrapper[4728]: I0125 06:20:12.328865 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:20:12 crc kubenswrapper[4728]: E0125 06:20:12.329638 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:20:26 crc kubenswrapper[4728]: I0125 06:20:26.329269 4728 scope.go:117] 
"RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:20:26 crc kubenswrapper[4728]: E0125 06:20:26.329914 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:20:37 crc kubenswrapper[4728]: I0125 06:20:37.329808 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:20:37 crc kubenswrapper[4728]: E0125 06:20:37.331530 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:20:51 crc kubenswrapper[4728]: I0125 06:20:51.334463 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:20:51 crc kubenswrapper[4728]: E0125 06:20:51.335204 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:21:05 crc kubenswrapper[4728]: I0125 06:21:05.329003 
4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:21:05 crc kubenswrapper[4728]: E0125 06:21:05.329868 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:21:20 crc kubenswrapper[4728]: I0125 06:21:20.329501 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:21:20 crc kubenswrapper[4728]: E0125 06:21:20.330572 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:21:34 crc kubenswrapper[4728]: I0125 06:21:34.328492 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:21:34 crc kubenswrapper[4728]: E0125 06:21:34.331189 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:21:46 crc kubenswrapper[4728]: I0125 
06:21:46.328821 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:21:46 crc kubenswrapper[4728]: E0125 06:21:46.329714 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:22:01 crc kubenswrapper[4728]: I0125 06:22:01.329435 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:22:01 crc kubenswrapper[4728]: E0125 06:22:01.330911 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:22:15 crc kubenswrapper[4728]: I0125 06:22:15.329053 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:22:15 crc kubenswrapper[4728]: E0125 06:22:15.329965 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:22:27 crc 
kubenswrapper[4728]: I0125 06:22:27.329060 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:22:27 crc kubenswrapper[4728]: E0125 06:22:27.330034 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:22:40 crc kubenswrapper[4728]: I0125 06:22:40.328645 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:22:40 crc kubenswrapper[4728]: E0125 06:22:40.329451 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:22:53 crc kubenswrapper[4728]: I0125 06:22:53.329460 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:22:53 crc kubenswrapper[4728]: E0125 06:22:53.330413 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 
25 06:23:06 crc kubenswrapper[4728]: I0125 06:23:06.329135 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:23:06 crc kubenswrapper[4728]: E0125 06:23:06.330112 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.635577 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7hl4f"] Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.637931 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.648984 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hl4f"] Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.735146 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxzw\" (UniqueName: \"kubernetes.io/projected/12591860-ed9c-43e6-99a9-bcae715270ee-kube-api-access-qmxzw\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.735216 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-catalog-content\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " 
pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.735456 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-utilities\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.837089 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-utilities\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.837193 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxzw\" (UniqueName: \"kubernetes.io/projected/12591860-ed9c-43e6-99a9-bcae715270ee-kube-api-access-qmxzw\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.837248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-catalog-content\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.837679 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-utilities\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 
06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.837739 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-catalog-content\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.856291 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxzw\" (UniqueName: \"kubernetes.io/projected/12591860-ed9c-43e6-99a9-bcae715270ee-kube-api-access-qmxzw\") pod \"redhat-operators-7hl4f\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:14 crc kubenswrapper[4728]: I0125 06:23:14.956033 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:15 crc kubenswrapper[4728]: I0125 06:23:15.372390 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hl4f"] Jan 25 06:23:16 crc kubenswrapper[4728]: I0125 06:23:16.020271 4728 generic.go:334] "Generic (PLEG): container finished" podID="12591860-ed9c-43e6-99a9-bcae715270ee" containerID="75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9" exitCode=0 Jan 25 06:23:16 crc kubenswrapper[4728]: I0125 06:23:16.020343 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerDied","Data":"75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9"} Jan 25 06:23:16 crc kubenswrapper[4728]: I0125 06:23:16.020381 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" 
event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerStarted","Data":"1cccb29d0ab4ab5477ed19e3f30c84f58232a1e9427e8c9fbddde1151c2ac7ac"} Jan 25 06:23:16 crc kubenswrapper[4728]: I0125 06:23:16.022722 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 06:23:17 crc kubenswrapper[4728]: I0125 06:23:17.035686 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerStarted","Data":"8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476"} Jan 25 06:23:19 crc kubenswrapper[4728]: I0125 06:23:19.068836 4728 generic.go:334] "Generic (PLEG): container finished" podID="12591860-ed9c-43e6-99a9-bcae715270ee" containerID="8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476" exitCode=0 Jan 25 06:23:19 crc kubenswrapper[4728]: I0125 06:23:19.068929 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerDied","Data":"8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476"} Jan 25 06:23:20 crc kubenswrapper[4728]: I0125 06:23:20.084875 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerStarted","Data":"dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d"} Jan 25 06:23:20 crc kubenswrapper[4728]: I0125 06:23:20.107491 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7hl4f" podStartSLOduration=2.567539114 podStartE2EDuration="6.107469086s" podCreationTimestamp="2026-01-25 06:23:14 +0000 UTC" firstStartedPulling="2026-01-25 06:23:16.022443046 +0000 UTC m=+2687.058321026" lastFinishedPulling="2026-01-25 06:23:19.562373018 +0000 UTC m=+2690.598250998" 
observedRunningTime="2026-01-25 06:23:20.102164254 +0000 UTC m=+2691.138042234" watchObservedRunningTime="2026-01-25 06:23:20.107469086 +0000 UTC m=+2691.143347056" Jan 25 06:23:20 crc kubenswrapper[4728]: I0125 06:23:20.329077 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:23:20 crc kubenswrapper[4728]: E0125 06:23:20.329550 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:23:24 crc kubenswrapper[4728]: I0125 06:23:24.956891 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:24 crc kubenswrapper[4728]: I0125 06:23:24.958196 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:25 crc kubenswrapper[4728]: I0125 06:23:25.992522 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7hl4f" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="registry-server" probeResult="failure" output=< Jan 25 06:23:25 crc kubenswrapper[4728]: timeout: failed to connect service ":50051" within 1s Jan 25 06:23:25 crc kubenswrapper[4728]: > Jan 25 06:23:34 crc kubenswrapper[4728]: I0125 06:23:34.990222 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:35 crc kubenswrapper[4728]: I0125 06:23:35.030406 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:35 crc kubenswrapper[4728]: I0125 06:23:35.219801 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hl4f"] Jan 25 06:23:35 crc kubenswrapper[4728]: I0125 06:23:35.329713 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:23:35 crc kubenswrapper[4728]: E0125 06:23:35.330063 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.209657 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7hl4f" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="registry-server" containerID="cri-o://dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d" gracePeriod=2 Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.623403 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.729145 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-utilities\") pod \"12591860-ed9c-43e6-99a9-bcae715270ee\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.729258 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmxzw\" (UniqueName: \"kubernetes.io/projected/12591860-ed9c-43e6-99a9-bcae715270ee-kube-api-access-qmxzw\") pod \"12591860-ed9c-43e6-99a9-bcae715270ee\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.729284 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-catalog-content\") pod \"12591860-ed9c-43e6-99a9-bcae715270ee\" (UID: \"12591860-ed9c-43e6-99a9-bcae715270ee\") " Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.729809 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-utilities" (OuterVolumeSpecName: "utilities") pod "12591860-ed9c-43e6-99a9-bcae715270ee" (UID: "12591860-ed9c-43e6-99a9-bcae715270ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.730546 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.735730 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12591860-ed9c-43e6-99a9-bcae715270ee-kube-api-access-qmxzw" (OuterVolumeSpecName: "kube-api-access-qmxzw") pod "12591860-ed9c-43e6-99a9-bcae715270ee" (UID: "12591860-ed9c-43e6-99a9-bcae715270ee"). InnerVolumeSpecName "kube-api-access-qmxzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.813403 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12591860-ed9c-43e6-99a9-bcae715270ee" (UID: "12591860-ed9c-43e6-99a9-bcae715270ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.832443 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmxzw\" (UniqueName: \"kubernetes.io/projected/12591860-ed9c-43e6-99a9-bcae715270ee-kube-api-access-qmxzw\") on node \"crc\" DevicePath \"\"" Jan 25 06:23:36 crc kubenswrapper[4728]: I0125 06:23:36.832467 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12591860-ed9c-43e6-99a9-bcae715270ee-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.242258 4728 generic.go:334] "Generic (PLEG): container finished" podID="12591860-ed9c-43e6-99a9-bcae715270ee" containerID="dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d" exitCode=0 Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.242350 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerDied","Data":"dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d"} Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.242354 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hl4f" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.242392 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hl4f" event={"ID":"12591860-ed9c-43e6-99a9-bcae715270ee","Type":"ContainerDied","Data":"1cccb29d0ab4ab5477ed19e3f30c84f58232a1e9427e8c9fbddde1151c2ac7ac"} Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.242432 4728 scope.go:117] "RemoveContainer" containerID="dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.265610 4728 scope.go:117] "RemoveContainer" containerID="8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.275383 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hl4f"] Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.286646 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7hl4f"] Jan 25 06:23:37 crc kubenswrapper[4728]: E0125 06:23:37.287146 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12591860_ed9c_43e6_99a9_bcae715270ee.slice/crio-1cccb29d0ab4ab5477ed19e3f30c84f58232a1e9427e8c9fbddde1151c2ac7ac\": RecentStats: unable to find data in memory cache]" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.298126 4728 scope.go:117] "RemoveContainer" containerID="75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.319193 4728 scope.go:117] "RemoveContainer" containerID="dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d" Jan 25 06:23:37 crc kubenswrapper[4728]: E0125 06:23:37.320521 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d\": container with ID starting with dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d not found: ID does not exist" containerID="dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.320564 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d"} err="failed to get container status \"dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d\": rpc error: code = NotFound desc = could not find container \"dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d\": container with ID starting with dd9e2c93b3d05ce6e6411b395121dc519eb41f174690ee345e45a72ac22e264d not found: ID does not exist" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.320595 4728 scope.go:117] "RemoveContainer" containerID="8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476" Jan 25 06:23:37 crc kubenswrapper[4728]: E0125 06:23:37.320966 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476\": container with ID starting with 8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476 not found: ID does not exist" containerID="8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.320995 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476"} err="failed to get container status \"8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476\": rpc error: code = NotFound desc = could not find container 
\"8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476\": container with ID starting with 8a235fbfe311eb5383283060f30b5cc8352f895480d519ac0bd9415dee87e476 not found: ID does not exist" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.321011 4728 scope.go:117] "RemoveContainer" containerID="75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9" Jan 25 06:23:37 crc kubenswrapper[4728]: E0125 06:23:37.322077 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9\": container with ID starting with 75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9 not found: ID does not exist" containerID="75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.322123 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9"} err="failed to get container status \"75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9\": rpc error: code = NotFound desc = could not find container \"75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9\": container with ID starting with 75903a87f50c210fb73ca29ed37f818524434185f319be9b120e4f17b02b99d9 not found: ID does not exist" Jan 25 06:23:37 crc kubenswrapper[4728]: I0125 06:23:37.342296 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" path="/var/lib/kubelet/pods/12591860-ed9c-43e6-99a9-bcae715270ee/volumes" Jan 25 06:23:46 crc kubenswrapper[4728]: I0125 06:23:46.329460 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:23:47 crc kubenswrapper[4728]: I0125 06:23:47.339610 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"0201cd13d5c97ce5e9051352b0acc2751424c1d8d457ff7eabb5281b1fdd90c2"} Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.368164 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4bgxp"] Jan 25 06:23:53 crc kubenswrapper[4728]: E0125 06:23:53.369111 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="extract-utilities" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.369129 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="extract-utilities" Jan 25 06:23:53 crc kubenswrapper[4728]: E0125 06:23:53.369139 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="extract-content" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.369147 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="extract-content" Jan 25 06:23:53 crc kubenswrapper[4728]: E0125 06:23:53.369168 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="registry-server" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.369175 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="registry-server" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.369400 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="12591860-ed9c-43e6-99a9-bcae715270ee" containerName="registry-server" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.371338 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.423799 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bgxp"] Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.469792 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsh8\" (UniqueName: \"kubernetes.io/projected/00d91717-c0a7-4a1a-99cd-04c33a971a1c-kube-api-access-sxsh8\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.470166 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-utilities\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.470280 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-catalog-content\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.572123 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsh8\" (UniqueName: \"kubernetes.io/projected/00d91717-c0a7-4a1a-99cd-04c33a971a1c-kube-api-access-sxsh8\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.572253 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-utilities\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.572281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-catalog-content\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.572704 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-utilities\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.572734 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-catalog-content\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.590558 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsh8\" (UniqueName: \"kubernetes.io/projected/00d91717-c0a7-4a1a-99cd-04c33a971a1c-kube-api-access-sxsh8\") pod \"redhat-marketplace-4bgxp\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:53 crc kubenswrapper[4728]: I0125 06:23:53.688766 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:23:54 crc kubenswrapper[4728]: I0125 06:23:54.129645 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bgxp"] Jan 25 06:23:54 crc kubenswrapper[4728]: I0125 06:23:54.395459 4728 generic.go:334] "Generic (PLEG): container finished" podID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerID="5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf" exitCode=0 Jan 25 06:23:54 crc kubenswrapper[4728]: I0125 06:23:54.395790 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bgxp" event={"ID":"00d91717-c0a7-4a1a-99cd-04c33a971a1c","Type":"ContainerDied","Data":"5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf"} Jan 25 06:23:54 crc kubenswrapper[4728]: I0125 06:23:54.395838 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bgxp" event={"ID":"00d91717-c0a7-4a1a-99cd-04c33a971a1c","Type":"ContainerStarted","Data":"ecaef68ccb1a359a470c2f9a9e8f6e7c226bb9260155b06a70a9093b19fe4474"} Jan 25 06:23:55 crc kubenswrapper[4728]: I0125 06:23:55.402568 4728 generic.go:334] "Generic (PLEG): container finished" podID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerID="6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb" exitCode=0 Jan 25 06:23:55 crc kubenswrapper[4728]: I0125 06:23:55.402659 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bgxp" event={"ID":"00d91717-c0a7-4a1a-99cd-04c33a971a1c","Type":"ContainerDied","Data":"6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb"} Jan 25 06:23:56 crc kubenswrapper[4728]: I0125 06:23:56.415224 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bgxp" 
event={"ID":"00d91717-c0a7-4a1a-99cd-04c33a971a1c","Type":"ContainerStarted","Data":"06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562"} Jan 25 06:23:56 crc kubenswrapper[4728]: I0125 06:23:56.440014 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4bgxp" podStartSLOduration=2.005602249 podStartE2EDuration="3.439974874s" podCreationTimestamp="2026-01-25 06:23:53 +0000 UTC" firstStartedPulling="2026-01-25 06:23:54.397760601 +0000 UTC m=+2725.433638582" lastFinishedPulling="2026-01-25 06:23:55.832133227 +0000 UTC m=+2726.868011207" observedRunningTime="2026-01-25 06:23:56.432811417 +0000 UTC m=+2727.468689397" watchObservedRunningTime="2026-01-25 06:23:56.439974874 +0000 UTC m=+2727.475852854" Jan 25 06:24:03 crc kubenswrapper[4728]: I0125 06:24:03.689794 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:24:03 crc kubenswrapper[4728]: I0125 06:24:03.690244 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:24:03 crc kubenswrapper[4728]: I0125 06:24:03.725490 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:24:04 crc kubenswrapper[4728]: I0125 06:24:04.513246 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:24:04 crc kubenswrapper[4728]: I0125 06:24:04.555332 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bgxp"] Jan 25 06:24:06 crc kubenswrapper[4728]: I0125 06:24:06.490485 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4bgxp" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="registry-server" 
containerID="cri-o://06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562" gracePeriod=2 Jan 25 06:24:06 crc kubenswrapper[4728]: I0125 06:24:06.893796 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.006164 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-catalog-content\") pod \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.006355 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-utilities\") pod \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.006457 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxsh8\" (UniqueName: \"kubernetes.io/projected/00d91717-c0a7-4a1a-99cd-04c33a971a1c-kube-api-access-sxsh8\") pod \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\" (UID: \"00d91717-c0a7-4a1a-99cd-04c33a971a1c\") " Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.007212 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-utilities" (OuterVolumeSpecName: "utilities") pod "00d91717-c0a7-4a1a-99cd-04c33a971a1c" (UID: "00d91717-c0a7-4a1a-99cd-04c33a971a1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.012019 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d91717-c0a7-4a1a-99cd-04c33a971a1c-kube-api-access-sxsh8" (OuterVolumeSpecName: "kube-api-access-sxsh8") pod "00d91717-c0a7-4a1a-99cd-04c33a971a1c" (UID: "00d91717-c0a7-4a1a-99cd-04c33a971a1c"). InnerVolumeSpecName "kube-api-access-sxsh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.022415 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00d91717-c0a7-4a1a-99cd-04c33a971a1c" (UID: "00d91717-c0a7-4a1a-99cd-04c33a971a1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.109776 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsh8\" (UniqueName: \"kubernetes.io/projected/00d91717-c0a7-4a1a-99cd-04c33a971a1c-kube-api-access-sxsh8\") on node \"crc\" DevicePath \"\"" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.109806 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.109819 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d91717-c0a7-4a1a-99cd-04c33a971a1c-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.500471 4728 generic.go:334] "Generic (PLEG): container finished" podID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" 
containerID="06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562" exitCode=0 Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.500528 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bgxp" event={"ID":"00d91717-c0a7-4a1a-99cd-04c33a971a1c","Type":"ContainerDied","Data":"06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562"} Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.500872 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bgxp" event={"ID":"00d91717-c0a7-4a1a-99cd-04c33a971a1c","Type":"ContainerDied","Data":"ecaef68ccb1a359a470c2f9a9e8f6e7c226bb9260155b06a70a9093b19fe4474"} Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.500894 4728 scope.go:117] "RemoveContainer" containerID="06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.500575 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bgxp" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.523637 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bgxp"] Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.526741 4728 scope.go:117] "RemoveContainer" containerID="6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.530121 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bgxp"] Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.541328 4728 scope.go:117] "RemoveContainer" containerID="5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.577060 4728 scope.go:117] "RemoveContainer" containerID="06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562" Jan 25 06:24:07 crc kubenswrapper[4728]: E0125 06:24:07.577354 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562\": container with ID starting with 06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562 not found: ID does not exist" containerID="06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.577386 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562"} err="failed to get container status \"06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562\": rpc error: code = NotFound desc = could not find container \"06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562\": container with ID starting with 06e0fc54c3cc127a5b7948cc074390db00a37e8e4009a47ce68b416b8aa31562 not found: 
ID does not exist" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.577408 4728 scope.go:117] "RemoveContainer" containerID="6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb" Jan 25 06:24:07 crc kubenswrapper[4728]: E0125 06:24:07.577680 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb\": container with ID starting with 6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb not found: ID does not exist" containerID="6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.577705 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb"} err="failed to get container status \"6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb\": rpc error: code = NotFound desc = could not find container \"6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb\": container with ID starting with 6c0734e2d6c84361a2ddd0344b8da36d7647f32a2edbde3cb76b1223d1522beb not found: ID does not exist" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.577720 4728 scope.go:117] "RemoveContainer" containerID="5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf" Jan 25 06:24:07 crc kubenswrapper[4728]: E0125 06:24:07.577990 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf\": container with ID starting with 5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf not found: ID does not exist" containerID="5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf" Jan 25 06:24:07 crc kubenswrapper[4728]: I0125 06:24:07.578102 4728 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf"} err="failed to get container status \"5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf\": rpc error: code = NotFound desc = could not find container \"5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf\": container with ID starting with 5ebedd5cea803f41454d6cab3f66243bc1fbc38289c8ea28399067ed031d69bf not found: ID does not exist" Jan 25 06:24:09 crc kubenswrapper[4728]: I0125 06:24:09.339711 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" path="/var/lib/kubelet/pods/00d91717-c0a7-4a1a-99cd-04c33a971a1c/volumes" Jan 25 06:26:12 crc kubenswrapper[4728]: I0125 06:26:12.899549 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:26:12 crc kubenswrapper[4728]: I0125 06:26:12.900232 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:26:42 crc kubenswrapper[4728]: I0125 06:26:42.899292 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:26:42 crc kubenswrapper[4728]: I0125 06:26:42.899961 4728 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:27:12 crc kubenswrapper[4728]: I0125 06:27:12.899193 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:27:12 crc kubenswrapper[4728]: I0125 06:27:12.900088 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:27:12 crc kubenswrapper[4728]: I0125 06:27:12.900174 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:27:12 crc kubenswrapper[4728]: I0125 06:27:12.901135 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0201cd13d5c97ce5e9051352b0acc2751424c1d8d457ff7eabb5281b1fdd90c2"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:27:12 crc kubenswrapper[4728]: I0125 06:27:12.901222 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" 
containerID="cri-o://0201cd13d5c97ce5e9051352b0acc2751424c1d8d457ff7eabb5281b1fdd90c2" gracePeriod=600 Jan 25 06:27:13 crc kubenswrapper[4728]: I0125 06:27:13.159182 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="0201cd13d5c97ce5e9051352b0acc2751424c1d8d457ff7eabb5281b1fdd90c2" exitCode=0 Jan 25 06:27:13 crc kubenswrapper[4728]: I0125 06:27:13.159279 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"0201cd13d5c97ce5e9051352b0acc2751424c1d8d457ff7eabb5281b1fdd90c2"} Jan 25 06:27:13 crc kubenswrapper[4728]: I0125 06:27:13.159508 4728 scope.go:117] "RemoveContainer" containerID="2cdee2f883f104c000964f3e8a675d47179413ca667539b34a17d4c3ec793ac3" Jan 25 06:27:14 crc kubenswrapper[4728]: I0125 06:27:14.169449 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c"} Jan 25 06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.855967 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jcn7"] Jan 25 06:27:55 crc kubenswrapper[4728]: E0125 06:27:55.856679 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="extract-content" Jan 25 06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.856691 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="extract-content" Jan 25 06:27:55 crc kubenswrapper[4728]: E0125 06:27:55.856722 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="extract-utilities" Jan 25 
06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.856728 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="extract-utilities" Jan 25 06:27:55 crc kubenswrapper[4728]: E0125 06:27:55.856746 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="registry-server" Jan 25 06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.856751 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="registry-server" Jan 25 06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.856916 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d91717-c0a7-4a1a-99cd-04c33a971a1c" containerName="registry-server" Jan 25 06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.858119 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:55 crc kubenswrapper[4728]: I0125 06:27:55.866428 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jcn7"] Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.027098 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9nx7\" (UniqueName: \"kubernetes.io/projected/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-kube-api-access-g9nx7\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.027183 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-catalog-content\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" 
Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.027260 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-utilities\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.129060 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-utilities\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.129211 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9nx7\" (UniqueName: \"kubernetes.io/projected/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-kube-api-access-g9nx7\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.129238 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-catalog-content\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.129549 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-utilities\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc 
kubenswrapper[4728]: I0125 06:27:56.129583 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-catalog-content\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.145869 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9nx7\" (UniqueName: \"kubernetes.io/projected/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-kube-api-access-g9nx7\") pod \"certified-operators-5jcn7\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.173256 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:27:56 crc kubenswrapper[4728]: I0125 06:27:56.639396 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jcn7"] Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.454440 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hv4pw"] Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.457490 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.468042 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv4pw"] Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.534212 4728 generic.go:334] "Generic (PLEG): container finished" podID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerID="79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9" exitCode=0 Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.534267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jcn7" event={"ID":"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a","Type":"ContainerDied","Data":"79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9"} Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.534294 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jcn7" event={"ID":"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a","Type":"ContainerStarted","Data":"34356f61294c032b9aa2588c49397b0237117ff9051393c7558845121309cc96"} Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.556701 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-utilities\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.556934 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67m7r\" (UniqueName: \"kubernetes.io/projected/eb094e19-5f6b-410e-9d7d-62b287f8b57c-kube-api-access-67m7r\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 
06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.557030 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-catalog-content\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.658704 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-utilities\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.658754 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67m7r\" (UniqueName: \"kubernetes.io/projected/eb094e19-5f6b-410e-9d7d-62b287f8b57c-kube-api-access-67m7r\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.658842 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-catalog-content\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.659227 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-utilities\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc 
kubenswrapper[4728]: I0125 06:27:57.659555 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-catalog-content\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.683705 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67m7r\" (UniqueName: \"kubernetes.io/projected/eb094e19-5f6b-410e-9d7d-62b287f8b57c-kube-api-access-67m7r\") pod \"community-operators-hv4pw\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:57 crc kubenswrapper[4728]: I0125 06:27:57.774495 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:27:58 crc kubenswrapper[4728]: I0125 06:27:58.271470 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv4pw"] Jan 25 06:27:58 crc kubenswrapper[4728]: I0125 06:27:58.543984 4728 generic.go:334] "Generic (PLEG): container finished" podID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerID="c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9" exitCode=0 Jan 25 06:27:58 crc kubenswrapper[4728]: I0125 06:27:58.544094 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerDied","Data":"c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9"} Jan 25 06:27:58 crc kubenswrapper[4728]: I0125 06:27:58.544367 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" 
event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerStarted","Data":"46870d1891f42c7823623abb618641a83319fee5001680d75bfaeb82c7f55810"} Jan 25 06:27:58 crc kubenswrapper[4728]: I0125 06:27:58.547463 4728 generic.go:334] "Generic (PLEG): container finished" podID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerID="c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3" exitCode=0 Jan 25 06:27:58 crc kubenswrapper[4728]: I0125 06:27:58.547507 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jcn7" event={"ID":"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a","Type":"ContainerDied","Data":"c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3"} Jan 25 06:27:59 crc kubenswrapper[4728]: I0125 06:27:59.557523 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jcn7" event={"ID":"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a","Type":"ContainerStarted","Data":"a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745"} Jan 25 06:27:59 crc kubenswrapper[4728]: I0125 06:27:59.559623 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerStarted","Data":"fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2"} Jan 25 06:27:59 crc kubenswrapper[4728]: I0125 06:27:59.577929 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jcn7" podStartSLOduration=3.11943459 podStartE2EDuration="4.577910743s" podCreationTimestamp="2026-01-25 06:27:55 +0000 UTC" firstStartedPulling="2026-01-25 06:27:57.536544901 +0000 UTC m=+2968.572422881" lastFinishedPulling="2026-01-25 06:27:58.995021054 +0000 UTC m=+2970.030899034" observedRunningTime="2026-01-25 06:27:59.572824313 +0000 UTC m=+2970.608702293" watchObservedRunningTime="2026-01-25 06:27:59.577910743 +0000 UTC 
m=+2970.613788724" Jan 25 06:28:00 crc kubenswrapper[4728]: I0125 06:28:00.571243 4728 generic.go:334] "Generic (PLEG): container finished" podID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerID="fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2" exitCode=0 Jan 25 06:28:00 crc kubenswrapper[4728]: I0125 06:28:00.571375 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerDied","Data":"fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2"} Jan 25 06:28:01 crc kubenswrapper[4728]: I0125 06:28:01.611654 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerStarted","Data":"d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3"} Jan 25 06:28:01 crc kubenswrapper[4728]: I0125 06:28:01.634365 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hv4pw" podStartSLOduration=2.13476787 podStartE2EDuration="4.634350408s" podCreationTimestamp="2026-01-25 06:27:57 +0000 UTC" firstStartedPulling="2026-01-25 06:27:58.545840586 +0000 UTC m=+2969.581718566" lastFinishedPulling="2026-01-25 06:28:01.045423124 +0000 UTC m=+2972.081301104" observedRunningTime="2026-01-25 06:28:01.626698741 +0000 UTC m=+2972.662576721" watchObservedRunningTime="2026-01-25 06:28:01.634350408 +0000 UTC m=+2972.670228387" Jan 25 06:28:06 crc kubenswrapper[4728]: I0125 06:28:06.174400 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:28:06 crc kubenswrapper[4728]: I0125 06:28:06.175625 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:28:06 crc kubenswrapper[4728]: I0125 06:28:06.214589 
4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:28:06 crc kubenswrapper[4728]: I0125 06:28:06.686312 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:28:07 crc kubenswrapper[4728]: I0125 06:28:07.774670 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:28:07 crc kubenswrapper[4728]: I0125 06:28:07.774743 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:28:07 crc kubenswrapper[4728]: I0125 06:28:07.810563 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:28:08 crc kubenswrapper[4728]: I0125 06:28:08.705842 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.252124 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jcn7"] Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.252460 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jcn7" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="registry-server" containerID="cri-o://a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745" gracePeriod=2 Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.643682 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.690018 4728 generic.go:334] "Generic (PLEG): container finished" podID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerID="a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745" exitCode=0 Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.690105 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jcn7" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.690130 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jcn7" event={"ID":"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a","Type":"ContainerDied","Data":"a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745"} Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.690225 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jcn7" event={"ID":"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a","Type":"ContainerDied","Data":"34356f61294c032b9aa2588c49397b0237117ff9051393c7558845121309cc96"} Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.690253 4728 scope.go:117] "RemoveContainer" containerID="a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.713659 4728 scope.go:117] "RemoveContainer" containerID="c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.738123 4728 scope.go:117] "RemoveContainer" containerID="79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.786506 4728 scope.go:117] "RemoveContainer" containerID="a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745" Jan 25 06:28:09 crc kubenswrapper[4728]: E0125 06:28:09.787047 4728 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745\": container with ID starting with a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745 not found: ID does not exist" containerID="a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.787092 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745"} err="failed to get container status \"a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745\": rpc error: code = NotFound desc = could not find container \"a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745\": container with ID starting with a00ab271800a0c4f7de36f224e44dbc602fecb735969eee890d5c15060bbc745 not found: ID does not exist" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.787123 4728 scope.go:117] "RemoveContainer" containerID="c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3" Jan 25 06:28:09 crc kubenswrapper[4728]: E0125 06:28:09.787499 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3\": container with ID starting with c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3 not found: ID does not exist" containerID="c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.787541 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3"} err="failed to get container status \"c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3\": rpc error: code = NotFound desc = could not find container 
\"c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3\": container with ID starting with c9e509d7165199476918ab82988f179ac9232b540f1988129bbe604f9341f1d3 not found: ID does not exist" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.787571 4728 scope.go:117] "RemoveContainer" containerID="79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9" Jan 25 06:28:09 crc kubenswrapper[4728]: E0125 06:28:09.787982 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9\": container with ID starting with 79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9 not found: ID does not exist" containerID="79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.788004 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9"} err="failed to get container status \"79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9\": rpc error: code = NotFound desc = could not find container \"79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9\": container with ID starting with 79d3ec5de8c5da12967cf915669af24676675ea6a5d9ede15eaf3cf53e64e3d9 not found: ID does not exist" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.817192 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-catalog-content\") pod \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.817269 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9nx7\" (UniqueName: 
\"kubernetes.io/projected/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-kube-api-access-g9nx7\") pod \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.818011 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-utilities" (OuterVolumeSpecName: "utilities") pod "7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" (UID: "7b0fbe5e-2ddc-451f-bc93-62cb44c9204a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.817447 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-utilities\") pod \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\" (UID: \"7b0fbe5e-2ddc-451f-bc93-62cb44c9204a\") " Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.819018 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.824967 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-kube-api-access-g9nx7" (OuterVolumeSpecName: "kube-api-access-g9nx7") pod "7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" (UID: "7b0fbe5e-2ddc-451f-bc93-62cb44c9204a"). InnerVolumeSpecName "kube-api-access-g9nx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.850774 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" (UID: "7b0fbe5e-2ddc-451f-bc93-62cb44c9204a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.921464 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:28:09 crc kubenswrapper[4728]: I0125 06:28:09.921756 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9nx7\" (UniqueName: \"kubernetes.io/projected/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a-kube-api-access-g9nx7\") on node \"crc\" DevicePath \"\"" Jan 25 06:28:10 crc kubenswrapper[4728]: I0125 06:28:10.021860 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jcn7"] Jan 25 06:28:10 crc kubenswrapper[4728]: I0125 06:28:10.038200 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jcn7"] Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.247863 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv4pw"] Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.248352 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hv4pw" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="registry-server" containerID="cri-o://d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3" gracePeriod=2 Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 
06:28:11.337008 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" path="/var/lib/kubelet/pods/7b0fbe5e-2ddc-451f-bc93-62cb44c9204a/volumes" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.653816 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.720108 4728 generic.go:334] "Generic (PLEG): container finished" podID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerID="d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3" exitCode=0 Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.720161 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerDied","Data":"d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3"} Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.720173 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv4pw" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.720195 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv4pw" event={"ID":"eb094e19-5f6b-410e-9d7d-62b287f8b57c","Type":"ContainerDied","Data":"46870d1891f42c7823623abb618641a83319fee5001680d75bfaeb82c7f55810"} Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.720219 4728 scope.go:117] "RemoveContainer" containerID="d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.736572 4728 scope.go:117] "RemoveContainer" containerID="fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.752820 4728 scope.go:117] "RemoveContainer" containerID="c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.759660 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67m7r\" (UniqueName: \"kubernetes.io/projected/eb094e19-5f6b-410e-9d7d-62b287f8b57c-kube-api-access-67m7r\") pod \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.759834 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-catalog-content\") pod \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\" (UID: \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.760046 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-utilities\") pod \"eb094e19-5f6b-410e-9d7d-62b287f8b57c\" (UID: 
\"eb094e19-5f6b-410e-9d7d-62b287f8b57c\") " Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.760661 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-utilities" (OuterVolumeSpecName: "utilities") pod "eb094e19-5f6b-410e-9d7d-62b287f8b57c" (UID: "eb094e19-5f6b-410e-9d7d-62b287f8b57c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.760961 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.765068 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb094e19-5f6b-410e-9d7d-62b287f8b57c-kube-api-access-67m7r" (OuterVolumeSpecName: "kube-api-access-67m7r") pod "eb094e19-5f6b-410e-9d7d-62b287f8b57c" (UID: "eb094e19-5f6b-410e-9d7d-62b287f8b57c"). InnerVolumeSpecName "kube-api-access-67m7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.800261 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb094e19-5f6b-410e-9d7d-62b287f8b57c" (UID: "eb094e19-5f6b-410e-9d7d-62b287f8b57c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.820147 4728 scope.go:117] "RemoveContainer" containerID="d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3" Jan 25 06:28:11 crc kubenswrapper[4728]: E0125 06:28:11.820784 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3\": container with ID starting with d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3 not found: ID does not exist" containerID="d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.820832 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3"} err="failed to get container status \"d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3\": rpc error: code = NotFound desc = could not find container \"d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3\": container with ID starting with d56384fd61f8614b446ac77a044c6ee77d1f12e84c6fec1117726b1a439b7df3 not found: ID does not exist" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.820863 4728 scope.go:117] "RemoveContainer" containerID="fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2" Jan 25 06:28:11 crc kubenswrapper[4728]: E0125 06:28:11.821280 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2\": container with ID starting with fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2 not found: ID does not exist" containerID="fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.821339 
4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2"} err="failed to get container status \"fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2\": rpc error: code = NotFound desc = could not find container \"fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2\": container with ID starting with fab558170ca19795e19963c4a4433a9bafa717653f91160cb4ac594ad6b757f2 not found: ID does not exist" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.821378 4728 scope.go:117] "RemoveContainer" containerID="c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9" Jan 25 06:28:11 crc kubenswrapper[4728]: E0125 06:28:11.821736 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9\": container with ID starting with c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9 not found: ID does not exist" containerID="c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.821771 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9"} err="failed to get container status \"c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9\": rpc error: code = NotFound desc = could not find container \"c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9\": container with ID starting with c7a9e1900626df20635bf639ade12d530dbf617ed4dce0f4b056c1c35353bad9 not found: ID does not exist" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.863470 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67m7r\" (UniqueName: 
\"kubernetes.io/projected/eb094e19-5f6b-410e-9d7d-62b287f8b57c-kube-api-access-67m7r\") on node \"crc\" DevicePath \"\"" Jan 25 06:28:11 crc kubenswrapper[4728]: I0125 06:28:11.863495 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb094e19-5f6b-410e-9d7d-62b287f8b57c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:28:12 crc kubenswrapper[4728]: I0125 06:28:12.052966 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv4pw"] Jan 25 06:28:12 crc kubenswrapper[4728]: I0125 06:28:12.061203 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hv4pw"] Jan 25 06:28:13 crc kubenswrapper[4728]: I0125 06:28:13.341007 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" path="/var/lib/kubelet/pods/eb094e19-5f6b-410e-9d7d-62b287f8b57c/volumes" Jan 25 06:29:42 crc kubenswrapper[4728]: I0125 06:29:42.898898 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:29:42 crc kubenswrapper[4728]: I0125 06:29:42.899773 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.145113 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6"] Jan 25 06:30:00 crc kubenswrapper[4728]: E0125 06:30:00.145931 4728 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="extract-content" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.145944 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="extract-content" Jan 25 06:30:00 crc kubenswrapper[4728]: E0125 06:30:00.145958 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="extract-content" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.145964 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="extract-content" Jan 25 06:30:00 crc kubenswrapper[4728]: E0125 06:30:00.145985 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="extract-utilities" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.145991 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="extract-utilities" Jan 25 06:30:00 crc kubenswrapper[4728]: E0125 06:30:00.146015 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="extract-utilities" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.146020 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="extract-utilities" Jan 25 06:30:00 crc kubenswrapper[4728]: E0125 06:30:00.146026 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="registry-server" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.146031 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="registry-server" Jan 25 06:30:00 crc kubenswrapper[4728]: E0125 06:30:00.146040 4728 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="registry-server" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.146045 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="registry-server" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.146196 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0fbe5e-2ddc-451f-bc93-62cb44c9204a" containerName="registry-server" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.146213 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb094e19-5f6b-410e-9d7d-62b287f8b57c" containerName="registry-server" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.146790 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.152732 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.152779 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.160828 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6"] Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.221861 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-secret-volume\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 
crc kubenswrapper[4728]: I0125 06:30:00.221909 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndtx\" (UniqueName: \"kubernetes.io/projected/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-kube-api-access-xndtx\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.221930 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-config-volume\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.323238 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-secret-volume\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.323282 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-config-volume\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.323300 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xndtx\" (UniqueName: \"kubernetes.io/projected/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-kube-api-access-xndtx\") pod 
\"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.324559 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-config-volume\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.330063 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-secret-volume\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.338874 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndtx\" (UniqueName: \"kubernetes.io/projected/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-kube-api-access-xndtx\") pod \"collect-profiles-29488710-wfvm6\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.464301 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:00 crc kubenswrapper[4728]: I0125 06:30:00.867465 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6"] Jan 25 06:30:01 crc kubenswrapper[4728]: I0125 06:30:01.678056 4728 generic.go:334] "Generic (PLEG): container finished" podID="9e382cf8-a8ec-44d5-931b-3ff458f4bb41" containerID="99d041ef051412475929ef773a76258a2282a50f34962e54ecbf6d2679e2dcb3" exitCode=0 Jan 25 06:30:01 crc kubenswrapper[4728]: I0125 06:30:01.678145 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" event={"ID":"9e382cf8-a8ec-44d5-931b-3ff458f4bb41","Type":"ContainerDied","Data":"99d041ef051412475929ef773a76258a2282a50f34962e54ecbf6d2679e2dcb3"} Jan 25 06:30:01 crc kubenswrapper[4728]: I0125 06:30:01.678430 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" event={"ID":"9e382cf8-a8ec-44d5-931b-3ff458f4bb41","Type":"ContainerStarted","Data":"610ec92ff7acb49b0a50d12be47c44010cf6df9718146c56e43c129ab984d00d"} Jan 25 06:30:02 crc kubenswrapper[4728]: I0125 06:30:02.978836 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.084560 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-config-volume\") pod \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.084622 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-secret-volume\") pod \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.084684 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xndtx\" (UniqueName: \"kubernetes.io/projected/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-kube-api-access-xndtx\") pod \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\" (UID: \"9e382cf8-a8ec-44d5-931b-3ff458f4bb41\") " Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.085262 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-config-volume" (OuterVolumeSpecName: "config-volume") pod "9e382cf8-a8ec-44d5-931b-3ff458f4bb41" (UID: "9e382cf8-a8ec-44d5-931b-3ff458f4bb41"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.085911 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-config-volume\") on node \"crc\" DevicePath \"\"" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.091106 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-kube-api-access-xndtx" (OuterVolumeSpecName: "kube-api-access-xndtx") pod "9e382cf8-a8ec-44d5-931b-3ff458f4bb41" (UID: "9e382cf8-a8ec-44d5-931b-3ff458f4bb41"). InnerVolumeSpecName "kube-api-access-xndtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.092368 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9e382cf8-a8ec-44d5-931b-3ff458f4bb41" (UID: "9e382cf8-a8ec-44d5-931b-3ff458f4bb41"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.188253 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.188296 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xndtx\" (UniqueName: \"kubernetes.io/projected/9e382cf8-a8ec-44d5-931b-3ff458f4bb41-kube-api-access-xndtx\") on node \"crc\" DevicePath \"\"" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.699674 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" event={"ID":"9e382cf8-a8ec-44d5-931b-3ff458f4bb41","Type":"ContainerDied","Data":"610ec92ff7acb49b0a50d12be47c44010cf6df9718146c56e43c129ab984d00d"} Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.699728 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="610ec92ff7acb49b0a50d12be47c44010cf6df9718146c56e43c129ab984d00d" Jan 25 06:30:03 crc kubenswrapper[4728]: I0125 06:30:03.699798 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488710-wfvm6" Jan 25 06:30:04 crc kubenswrapper[4728]: I0125 06:30:04.046935 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz"] Jan 25 06:30:04 crc kubenswrapper[4728]: I0125 06:30:04.052039 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488665-blqfz"] Jan 25 06:30:05 crc kubenswrapper[4728]: I0125 06:30:05.342361 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="022c3734-c432-4ba6-9e9b-19fcedd5db9c" path="/var/lib/kubelet/pods/022c3734-c432-4ba6-9e9b-19fcedd5db9c/volumes" Jan 25 06:30:12 crc kubenswrapper[4728]: I0125 06:30:12.899279 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:30:12 crc kubenswrapper[4728]: I0125 06:30:12.900007 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:30:33 crc kubenswrapper[4728]: I0125 06:30:33.854287 4728 scope.go:117] "RemoveContainer" containerID="bd855b185c4d35baade53a1731370fdcca7a90d2b6960f6fe567f6dcc901311d" Jan 25 06:30:42 crc kubenswrapper[4728]: I0125 06:30:42.899203 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Jan 25 06:30:42 crc kubenswrapper[4728]: I0125 06:30:42.899798 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:30:42 crc kubenswrapper[4728]: I0125 06:30:42.899855 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:30:42 crc kubenswrapper[4728]: I0125 06:30:42.900463 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:30:42 crc kubenswrapper[4728]: I0125 06:30:42.900518 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" gracePeriod=600 Jan 25 06:30:43 crc kubenswrapper[4728]: E0125 06:30:43.015626 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:30:43 crc kubenswrapper[4728]: I0125 
06:30:43.037984 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" exitCode=0 Jan 25 06:30:43 crc kubenswrapper[4728]: I0125 06:30:43.038029 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c"} Jan 25 06:30:43 crc kubenswrapper[4728]: I0125 06:30:43.038081 4728 scope.go:117] "RemoveContainer" containerID="0201cd13d5c97ce5e9051352b0acc2751424c1d8d457ff7eabb5281b1fdd90c2" Jan 25 06:30:43 crc kubenswrapper[4728]: I0125 06:30:43.038867 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:30:43 crc kubenswrapper[4728]: E0125 06:30:43.039239 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:30:53 crc kubenswrapper[4728]: I0125 06:30:53.329701 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:30:53 crc kubenswrapper[4728]: E0125 06:30:53.330766 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:31:07 crc kubenswrapper[4728]: I0125 06:31:07.329528 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:31:07 crc kubenswrapper[4728]: E0125 06:31:07.330189 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:31:20 crc kubenswrapper[4728]: I0125 06:31:20.330034 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:31:20 crc kubenswrapper[4728]: E0125 06:31:20.331994 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:31:28 crc kubenswrapper[4728]: I0125 06:31:28.415236 4728 generic.go:334] "Generic (PLEG): container finished" podID="29c48d20-6804-4826-89f8-2b3e00949942" containerID="47b20c0f721975305db1f61e939099e278b2a284c66fd773d2db57e3fffeaf33" exitCode=0 Jan 25 06:31:28 crc kubenswrapper[4728]: I0125 06:31:28.415459 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"29c48d20-6804-4826-89f8-2b3e00949942","Type":"ContainerDied","Data":"47b20c0f721975305db1f61e939099e278b2a284c66fd773d2db57e3fffeaf33"} Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.777806 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.881468 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-temporary\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.881522 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-config-data\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882066 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd4kg\" (UniqueName: \"kubernetes.io/projected/29c48d20-6804-4826-89f8-2b3e00949942-kube-api-access-qd4kg\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882092 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882104 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ssh-key\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882187 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882301 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config-secret\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882447 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ca-certs\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882509 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-workdir\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882556 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config\") pod \"29c48d20-6804-4826-89f8-2b3e00949942\" (UID: \"29c48d20-6804-4826-89f8-2b3e00949942\") " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.882300 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-config-data" (OuterVolumeSpecName: "config-data") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.885138 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-config-data\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.885426 4728 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.888063 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.888124 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.888377 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c48d20-6804-4826-89f8-2b3e00949942-kube-api-access-qd4kg" (OuterVolumeSpecName: "kube-api-access-qd4kg") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "kube-api-access-qd4kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.905763 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.907501 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.907607 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.923529 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "29c48d20-6804-4826-89f8-2b3e00949942" (UID: "29c48d20-6804-4826-89f8-2b3e00949942"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987040 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd4kg\" (UniqueName: \"kubernetes.io/projected/29c48d20-6804-4826-89f8-2b3e00949942-kube-api-access-qd4kg\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987065 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987093 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987103 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987111 4728 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/29c48d20-6804-4826-89f8-2b3e00949942-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987119 4728 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/29c48d20-6804-4826-89f8-2b3e00949942-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:29 crc kubenswrapper[4728]: I0125 06:31:29.987130 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29c48d20-6804-4826-89f8-2b3e00949942-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:30 crc kubenswrapper[4728]: I0125 06:31:30.002719 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 25 06:31:30 crc kubenswrapper[4728]: I0125 06:31:30.090540 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 25 06:31:30 crc kubenswrapper[4728]: I0125 06:31:30.443212 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"29c48d20-6804-4826-89f8-2b3e00949942","Type":"ContainerDied","Data":"1d18f496cd9d9c2c40863f34d5d47abf382545fe90ed3692485e973d890e6610"} Jan 25 06:31:30 crc kubenswrapper[4728]: I0125 06:31:30.443263 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 25 06:31:30 crc kubenswrapper[4728]: I0125 06:31:30.443263 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d18f496cd9d9c2c40863f34d5d47abf382545fe90ed3692485e973d890e6610" Jan 25 06:31:33 crc kubenswrapper[4728]: I0125 06:31:33.328894 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:31:33 crc kubenswrapper[4728]: E0125 06:31:33.329486 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.143735 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 25 06:31:39 crc kubenswrapper[4728]: E0125 06:31:39.144549 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e382cf8-a8ec-44d5-931b-3ff458f4bb41" containerName="collect-profiles" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.144565 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e382cf8-a8ec-44d5-931b-3ff458f4bb41" containerName="collect-profiles" Jan 25 06:31:39 crc kubenswrapper[4728]: E0125 06:31:39.144585 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c48d20-6804-4826-89f8-2b3e00949942" containerName="tempest-tests-tempest-tests-runner" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.144591 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c48d20-6804-4826-89f8-2b3e00949942" containerName="tempest-tests-tempest-tests-runner" Jan 25 06:31:39 crc 
kubenswrapper[4728]: I0125 06:31:39.144816 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e382cf8-a8ec-44d5-931b-3ff458f4bb41" containerName="collect-profiles" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.144830 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c48d20-6804-4826-89f8-2b3e00949942" containerName="tempest-tests-tempest-tests-runner" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.145457 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.147941 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n8nfh" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.152757 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.266338 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx7t7\" (UniqueName: \"kubernetes.io/projected/a9a9ae9b-52ea-4cc4-b645-f74946df2a17-kube-api-access-zx7t7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.266555 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.368463 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.368566 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx7t7\" (UniqueName: \"kubernetes.io/projected/a9a9ae9b-52ea-4cc4-b645-f74946df2a17-kube-api-access-zx7t7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.369244 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.399272 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx7t7\" (UniqueName: \"kubernetes.io/projected/a9a9ae9b-52ea-4cc4-b645-f74946df2a17-kube-api-access-zx7t7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.411896 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9a9ae9b-52ea-4cc4-b645-f74946df2a17\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.461861 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.834567 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 25 06:31:39 crc kubenswrapper[4728]: I0125 06:31:39.837553 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 06:31:40 crc kubenswrapper[4728]: I0125 06:31:40.540753 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a9a9ae9b-52ea-4cc4-b645-f74946df2a17","Type":"ContainerStarted","Data":"9f5a43f05584d8c6e97c4e9274c9d0f4377a5ee5d1ef47d04ce61778f7dc6d16"} Jan 25 06:31:41 crc kubenswrapper[4728]: I0125 06:31:41.550198 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a9a9ae9b-52ea-4cc4-b645-f74946df2a17","Type":"ContainerStarted","Data":"453c330969c9f8e9ba8cb28462ffefc344a7eabe7508907a5a539c8ffc9dd857"} Jan 25 06:31:41 crc kubenswrapper[4728]: I0125 06:31:41.568192 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.601636521 podStartE2EDuration="2.568178898s" podCreationTimestamp="2026-01-25 06:31:39 +0000 UTC" firstStartedPulling="2026-01-25 06:31:39.837294273 +0000 UTC m=+3190.873172253" lastFinishedPulling="2026-01-25 06:31:40.80383665 +0000 UTC m=+3191.839714630" observedRunningTime="2026-01-25 06:31:41.563593412 +0000 UTC m=+3192.599471393" watchObservedRunningTime="2026-01-25 06:31:41.568178898 +0000 UTC m=+3192.604056879" Jan 25 06:31:45 crc kubenswrapper[4728]: 
I0125 06:31:45.329566 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:31:45 crc kubenswrapper[4728]: E0125 06:31:45.330659 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:31:57 crc kubenswrapper[4728]: I0125 06:31:57.329757 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:31:57 crc kubenswrapper[4728]: E0125 06:31:57.330541 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.495603 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2mzgl/must-gather-xhx78"] Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.497591 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.499241 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2mzgl"/"kube-root-ca.crt" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.499279 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2mzgl"/"openshift-service-ca.crt" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.509419 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2mzgl/must-gather-xhx78"] Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.651381 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/167b706d-7d8c-46ae-a5ad-14c70b7f6948-must-gather-output\") pod \"must-gather-xhx78\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.651572 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhwgr\" (UniqueName: \"kubernetes.io/projected/167b706d-7d8c-46ae-a5ad-14c70b7f6948-kube-api-access-nhwgr\") pod \"must-gather-xhx78\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.754527 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhwgr\" (UniqueName: \"kubernetes.io/projected/167b706d-7d8c-46ae-a5ad-14c70b7f6948-kube-api-access-nhwgr\") pod \"must-gather-xhx78\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.754714 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/167b706d-7d8c-46ae-a5ad-14c70b7f6948-must-gather-output\") pod \"must-gather-xhx78\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.755154 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/167b706d-7d8c-46ae-a5ad-14c70b7f6948-must-gather-output\") pod \"must-gather-xhx78\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.772950 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhwgr\" (UniqueName: \"kubernetes.io/projected/167b706d-7d8c-46ae-a5ad-14c70b7f6948-kube-api-access-nhwgr\") pod \"must-gather-xhx78\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:31:59 crc kubenswrapper[4728]: I0125 06:31:59.813454 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:32:00 crc kubenswrapper[4728]: I0125 06:32:00.222393 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2mzgl/must-gather-xhx78"] Jan 25 06:32:00 crc kubenswrapper[4728]: W0125 06:32:00.225579 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod167b706d_7d8c_46ae_a5ad_14c70b7f6948.slice/crio-eeeaaa632f826815fe87c99ca78c8c31023418ea1fe498c4e408e001f6fb7b57 WatchSource:0}: Error finding container eeeaaa632f826815fe87c99ca78c8c31023418ea1fe498c4e408e001f6fb7b57: Status 404 returned error can't find the container with id eeeaaa632f826815fe87c99ca78c8c31023418ea1fe498c4e408e001f6fb7b57 Jan 25 06:32:00 crc kubenswrapper[4728]: I0125 06:32:00.720835 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/must-gather-xhx78" event={"ID":"167b706d-7d8c-46ae-a5ad-14c70b7f6948","Type":"ContainerStarted","Data":"eeeaaa632f826815fe87c99ca78c8c31023418ea1fe498c4e408e001f6fb7b57"} Jan 25 06:32:05 crc kubenswrapper[4728]: I0125 06:32:05.768475 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/must-gather-xhx78" event={"ID":"167b706d-7d8c-46ae-a5ad-14c70b7f6948","Type":"ContainerStarted","Data":"5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213"} Jan 25 06:32:05 crc kubenswrapper[4728]: I0125 06:32:05.769105 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/must-gather-xhx78" event={"ID":"167b706d-7d8c-46ae-a5ad-14c70b7f6948","Type":"ContainerStarted","Data":"c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05"} Jan 25 06:32:05 crc kubenswrapper[4728]: I0125 06:32:05.789504 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2mzgl/must-gather-xhx78" podStartSLOduration=1.815806104 
podStartE2EDuration="6.789492968s" podCreationTimestamp="2026-01-25 06:31:59 +0000 UTC" firstStartedPulling="2026-01-25 06:32:00.228125452 +0000 UTC m=+3211.264003432" lastFinishedPulling="2026-01-25 06:32:05.201812317 +0000 UTC m=+3216.237690296" observedRunningTime="2026-01-25 06:32:05.78048285 +0000 UTC m=+3216.816360830" watchObservedRunningTime="2026-01-25 06:32:05.789492968 +0000 UTC m=+3216.825370948" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.329179 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:32:08 crc kubenswrapper[4728]: E0125 06:32:08.330763 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.412427 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2mzgl/crc-debug-9xhwb"] Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.413702 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.417077 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2mzgl"/"default-dockercfg-lpclx" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.534823 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mjtw\" (UniqueName: \"kubernetes.io/projected/45860509-79cc-4066-b853-d27cd76bf9d0-kube-api-access-2mjtw\") pod \"crc-debug-9xhwb\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.534939 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45860509-79cc-4066-b853-d27cd76bf9d0-host\") pod \"crc-debug-9xhwb\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.637247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mjtw\" (UniqueName: \"kubernetes.io/projected/45860509-79cc-4066-b853-d27cd76bf9d0-kube-api-access-2mjtw\") pod \"crc-debug-9xhwb\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.637802 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45860509-79cc-4066-b853-d27cd76bf9d0-host\") pod \"crc-debug-9xhwb\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.637927 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/45860509-79cc-4066-b853-d27cd76bf9d0-host\") pod \"crc-debug-9xhwb\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.656520 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mjtw\" (UniqueName: \"kubernetes.io/projected/45860509-79cc-4066-b853-d27cd76bf9d0-kube-api-access-2mjtw\") pod \"crc-debug-9xhwb\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.741822 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:08 crc kubenswrapper[4728]: W0125 06:32:08.775707 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45860509_79cc_4066_b853_d27cd76bf9d0.slice/crio-3107991368ae0b0455c42cd56f9791c28cb2e9de2bb4f7f7b2d6a68b5b64cc5c WatchSource:0}: Error finding container 3107991368ae0b0455c42cd56f9791c28cb2e9de2bb4f7f7b2d6a68b5b64cc5c: Status 404 returned error can't find the container with id 3107991368ae0b0455c42cd56f9791c28cb2e9de2bb4f7f7b2d6a68b5b64cc5c Jan 25 06:32:08 crc kubenswrapper[4728]: I0125 06:32:08.794509 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" event={"ID":"45860509-79cc-4066-b853-d27cd76bf9d0","Type":"ContainerStarted","Data":"3107991368ae0b0455c42cd56f9791c28cb2e9de2bb4f7f7b2d6a68b5b64cc5c"} Jan 25 06:32:19 crc kubenswrapper[4728]: I0125 06:32:19.889394 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" event={"ID":"45860509-79cc-4066-b853-d27cd76bf9d0","Type":"ContainerStarted","Data":"c207dc5bed29f51e230dd0d27a218136d54c4c51063464beb50a63c742064e23"} Jan 25 06:32:19 crc kubenswrapper[4728]: I0125 
06:32:19.912740 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" podStartSLOduration=1.096933873 podStartE2EDuration="11.912720733s" podCreationTimestamp="2026-01-25 06:32:08 +0000 UTC" firstStartedPulling="2026-01-25 06:32:08.777947564 +0000 UTC m=+3219.813825544" lastFinishedPulling="2026-01-25 06:32:19.593734434 +0000 UTC m=+3230.629612404" observedRunningTime="2026-01-25 06:32:19.912567554 +0000 UTC m=+3230.948445534" watchObservedRunningTime="2026-01-25 06:32:19.912720733 +0000 UTC m=+3230.948598713" Jan 25 06:32:22 crc kubenswrapper[4728]: I0125 06:32:22.329533 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:32:22 crc kubenswrapper[4728]: E0125 06:32:22.330510 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:32:31 crc kubenswrapper[4728]: I0125 06:32:31.983252 4728 generic.go:334] "Generic (PLEG): container finished" podID="45860509-79cc-4066-b853-d27cd76bf9d0" containerID="c207dc5bed29f51e230dd0d27a218136d54c4c51063464beb50a63c742064e23" exitCode=0 Jan 25 06:32:31 crc kubenswrapper[4728]: I0125 06:32:31.983350 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" event={"ID":"45860509-79cc-4066-b853-d27cd76bf9d0","Type":"ContainerDied","Data":"c207dc5bed29f51e230dd0d27a218136d54c4c51063464beb50a63c742064e23"} Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.096077 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.137821 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2mzgl/crc-debug-9xhwb"] Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.148107 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2mzgl/crc-debug-9xhwb"] Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.261129 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45860509-79cc-4066-b853-d27cd76bf9d0-host\") pod \"45860509-79cc-4066-b853-d27cd76bf9d0\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.261465 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mjtw\" (UniqueName: \"kubernetes.io/projected/45860509-79cc-4066-b853-d27cd76bf9d0-kube-api-access-2mjtw\") pod \"45860509-79cc-4066-b853-d27cd76bf9d0\" (UID: \"45860509-79cc-4066-b853-d27cd76bf9d0\") " Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.261501 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45860509-79cc-4066-b853-d27cd76bf9d0-host" (OuterVolumeSpecName: "host") pod "45860509-79cc-4066-b853-d27cd76bf9d0" (UID: "45860509-79cc-4066-b853-d27cd76bf9d0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.282496 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45860509-79cc-4066-b853-d27cd76bf9d0-kube-api-access-2mjtw" (OuterVolumeSpecName: "kube-api-access-2mjtw") pod "45860509-79cc-4066-b853-d27cd76bf9d0" (UID: "45860509-79cc-4066-b853-d27cd76bf9d0"). InnerVolumeSpecName "kube-api-access-2mjtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.338283 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45860509-79cc-4066-b853-d27cd76bf9d0" path="/var/lib/kubelet/pods/45860509-79cc-4066-b853-d27cd76bf9d0/volumes" Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.364783 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mjtw\" (UniqueName: \"kubernetes.io/projected/45860509-79cc-4066-b853-d27cd76bf9d0-kube-api-access-2mjtw\") on node \"crc\" DevicePath \"\"" Jan 25 06:32:33 crc kubenswrapper[4728]: I0125 06:32:33.364886 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45860509-79cc-4066-b853-d27cd76bf9d0-host\") on node \"crc\" DevicePath \"\"" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:33.999769 4728 scope.go:117] "RemoveContainer" containerID="c207dc5bed29f51e230dd0d27a218136d54c4c51063464beb50a63c742064e23" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:33.999806 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-9xhwb" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.300312 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2mzgl/crc-debug-cnmf9"] Jan 25 06:32:34 crc kubenswrapper[4728]: E0125 06:32:34.301776 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45860509-79cc-4066-b853-d27cd76bf9d0" containerName="container-00" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.301870 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="45860509-79cc-4066-b853-d27cd76bf9d0" containerName="container-00" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.302259 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="45860509-79cc-4066-b853-d27cd76bf9d0" containerName="container-00" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.303292 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.305379 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2mzgl"/"default-dockercfg-lpclx" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.328762 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:32:34 crc kubenswrapper[4728]: E0125 06:32:34.329229 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.384767 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-host\") pod \"crc-debug-cnmf9\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.384821 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-725pb\" (UniqueName: \"kubernetes.io/projected/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-kube-api-access-725pb\") pod \"crc-debug-cnmf9\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.487741 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-host\") pod \"crc-debug-cnmf9\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.487812 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-725pb\" (UniqueName: \"kubernetes.io/projected/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-kube-api-access-725pb\") pod \"crc-debug-cnmf9\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.488156 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-host\") pod \"crc-debug-cnmf9\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.504751 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-725pb\" (UniqueName: 
\"kubernetes.io/projected/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-kube-api-access-725pb\") pod \"crc-debug-cnmf9\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: I0125 06:32:34.618042 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:34 crc kubenswrapper[4728]: W0125 06:32:34.640212 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod414a1bca_8137_470d_9a7f_5cd95f7d7f8c.slice/crio-88b0c4b2c8162e2f8bea3b4d9cd195a353631c90f8b7c333b3de905f2a3077bc WatchSource:0}: Error finding container 88b0c4b2c8162e2f8bea3b4d9cd195a353631c90f8b7c333b3de905f2a3077bc: Status 404 returned error can't find the container with id 88b0c4b2c8162e2f8bea3b4d9cd195a353631c90f8b7c333b3de905f2a3077bc Jan 25 06:32:35 crc kubenswrapper[4728]: I0125 06:32:35.011031 4728 generic.go:334] "Generic (PLEG): container finished" podID="414a1bca-8137-470d-9a7f-5cd95f7d7f8c" containerID="dcb5e605b226b29b91f7d866c98c602cc6b09d814f5032c669a493a7c05b9c70" exitCode=1 Jan 25 06:32:35 crc kubenswrapper[4728]: I0125 06:32:35.011305 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" event={"ID":"414a1bca-8137-470d-9a7f-5cd95f7d7f8c","Type":"ContainerDied","Data":"dcb5e605b226b29b91f7d866c98c602cc6b09d814f5032c669a493a7c05b9c70"} Jan 25 06:32:35 crc kubenswrapper[4728]: I0125 06:32:35.011452 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" event={"ID":"414a1bca-8137-470d-9a7f-5cd95f7d7f8c","Type":"ContainerStarted","Data":"88b0c4b2c8162e2f8bea3b4d9cd195a353631c90f8b7c333b3de905f2a3077bc"} Jan 25 06:32:35 crc kubenswrapper[4728]: I0125 06:32:35.052415 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-2mzgl/crc-debug-cnmf9"] Jan 25 06:32:35 crc kubenswrapper[4728]: I0125 06:32:35.060935 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2mzgl/crc-debug-cnmf9"] Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.102801 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.229404 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-host\") pod \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.229503 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-725pb\" (UniqueName: \"kubernetes.io/projected/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-kube-api-access-725pb\") pod \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\" (UID: \"414a1bca-8137-470d-9a7f-5cd95f7d7f8c\") " Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.229794 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-host" (OuterVolumeSpecName: "host") pod "414a1bca-8137-470d-9a7f-5cd95f7d7f8c" (UID: "414a1bca-8137-470d-9a7f-5cd95f7d7f8c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.230949 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-host\") on node \"crc\" DevicePath \"\"" Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.235556 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-kube-api-access-725pb" (OuterVolumeSpecName: "kube-api-access-725pb") pod "414a1bca-8137-470d-9a7f-5cd95f7d7f8c" (UID: "414a1bca-8137-470d-9a7f-5cd95f7d7f8c"). InnerVolumeSpecName "kube-api-access-725pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:32:36 crc kubenswrapper[4728]: I0125 06:32:36.333438 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-725pb\" (UniqueName: \"kubernetes.io/projected/414a1bca-8137-470d-9a7f-5cd95f7d7f8c-kube-api-access-725pb\") on node \"crc\" DevicePath \"\"" Jan 25 06:32:37 crc kubenswrapper[4728]: I0125 06:32:37.038092 4728 scope.go:117] "RemoveContainer" containerID="dcb5e605b226b29b91f7d866c98c602cc6b09d814f5032c669a493a7c05b9c70" Jan 25 06:32:37 crc kubenswrapper[4728]: I0125 06:32:37.038374 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/crc-debug-cnmf9" Jan 25 06:32:37 crc kubenswrapper[4728]: I0125 06:32:37.340867 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414a1bca-8137-470d-9a7f-5cd95f7d7f8c" path="/var/lib/kubelet/pods/414a1bca-8137-470d-9a7f-5cd95f7d7f8c/volumes" Jan 25 06:32:48 crc kubenswrapper[4728]: I0125 06:32:48.330002 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:32:48 crc kubenswrapper[4728]: E0125 06:32:48.331241 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:32:59 crc kubenswrapper[4728]: I0125 06:32:59.334257 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:32:59 crc kubenswrapper[4728]: E0125 06:32:59.334996 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:33:05 crc kubenswrapper[4728]: I0125 06:33:05.611536 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b8466696b-psmv9_36477670-fd4c-4015-8fab-b7608c72a906/barbican-api/0.log" Jan 25 06:33:05 crc kubenswrapper[4728]: I0125 06:33:05.699718 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-6b8466696b-psmv9_36477670-fd4c-4015-8fab-b7608c72a906/barbican-api-log/0.log" Jan 25 06:33:05 crc kubenswrapper[4728]: I0125 06:33:05.780781 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b587b754-6vhmj_4564c893-fa22-47c0-92b9-4d503b3553ee/barbican-keystone-listener/0.log" Jan 25 06:33:05 crc kubenswrapper[4728]: I0125 06:33:05.781980 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b587b754-6vhmj_4564c893-fa22-47c0-92b9-4d503b3553ee/barbican-keystone-listener-log/0.log" Jan 25 06:33:05 crc kubenswrapper[4728]: I0125 06:33:05.902358 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b49f9f9b7-2ld95_fd2e0efe-0434-4103-a08d-d014f69addf6/barbican-worker/0.log" Jan 25 06:33:05 crc kubenswrapper[4728]: I0125 06:33:05.917374 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b49f9f9b7-2ld95_fd2e0efe-0434-4103-a08d-d014f69addf6/barbican-worker-log/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.062097 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd_18badfbd-fe91-4d6e-8ecd-765ed6994030/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.117264 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/ceilometer-central-agent/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.165053 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/ceilometer-notification-agent/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.253330 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/sg-core/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.278563 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/proxy-httpd/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.436368 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9e0ee7f-ce38-4e4f-afe0-993551ae84a8/cinder-api-log/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.485001 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e6ab997e-b648-4c08-9ba9-4166b43ebde2/cinder-scheduler/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.510425 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9e0ee7f-ce38-4e4f-afe0-993551ae84a8/cinder-api/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.632970 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e6ab997e-b648-4c08-9ba9-4166b43ebde2/probe/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.696848 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts_022da454-0c7e-4950-9147-f13a2f725b47/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.826183 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs_c70ce086-a2f9-4979-b292-aa69dc5f9bc3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.847668 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cf5bfd7f-hhq75_2e967299-2864-48a8-ba27-7d2a63f66c43/init/0.log" Jan 25 06:33:06 crc kubenswrapper[4728]: I0125 06:33:06.996535 
4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cf5bfd7f-hhq75_2e967299-2864-48a8-ba27-7d2a63f66c43/init/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.037204 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-665xk_abcaa620-a9bf-4edf-a044-ea75ca9fa872/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.053711 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cf5bfd7f-hhq75_2e967299-2864-48a8-ba27-7d2a63f66c43/dnsmasq-dns/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.174303 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8fbf2f2e-5205-4c3d-8b05-185404930c85/glance-log/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.192803 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8fbf2f2e-5205-4c3d-8b05-185404930c85/glance-httpd/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.489396 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_036ddc84-2b06-4817-9afd-537d8ed82150/glance-log/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.499953 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_036ddc84-2b06-4817-9afd-537d8ed82150/glance-httpd/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.510772 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-49vxd_c7e1052e-86fc-4070-be06-23fef77216d8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.671975 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qx5hk_4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:07 crc kubenswrapper[4728]: I0125 06:33:07.883737 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29488681-j86cs_6197fee2-5bbd-4edd-bcb5-c10f476f4f83/keystone-cron/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.019753 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-64dbb5f568-n5f5j_eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581/keystone-api/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.062900 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c45b9d32-afe0-490e-876d-64a9359773ff/kube-state-metrics/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.133270 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd_99aea3d5-d496-457b-87b9-95c444db3c76/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.409208 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54985bc57c-7dmw7_0d93e327-c397-427e-abe4-0065144bcb7a/neutron-api/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.447429 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5162db63-7667-482e-a9bd-174365a318cc/memcached/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.489628 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54985bc57c-7dmw7_0d93e327-c397-427e-abe4-0065144bcb7a/neutron-httpd/0.log" Jan 25 06:33:08 crc kubenswrapper[4728]: I0125 06:33:08.607782 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb_6f8bee4e-2d23-4efc-81cc-e82bb4466eb4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.020129 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_34d58c30-a0dd-40da-94b2-ab3cba2038ad/nova-cell0-conductor-conductor/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.071939 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bf4fc010-98c5-4734-a9c9-3de4f1d1a34b/nova-api-log/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.135507 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8ae38c3f-b058-49a8-8df8-5222dc364151/nova-cell1-conductor-conductor/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.194980 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bf4fc010-98c5-4734-a9c9-3de4f1d1a34b/nova-api-api/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.251572 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_99b7a342-4f5c-4977-b189-b0e4cf975704/nova-cell1-novncproxy-novncproxy/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.353260 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-prsds_6b1b4c44-e390-4a48-aa8a-84f5509ef99e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.391135 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_56aa160e-7328-465e-8908-a78bb2fc8364/nova-metadata-log/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.646062 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_086fe2f2-83d2-440c-bcde-f3d1bf8f21c8/nova-scheduler-scheduler/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.659299 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_847abeb1-1f82-44cc-a876-2b8787688696/mysql-bootstrap/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.861270 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_847abeb1-1f82-44cc-a876-2b8787688696/mysql-bootstrap/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.886712 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_847abeb1-1f82-44cc-a876-2b8787688696/galera/0.log" Jan 25 06:33:09 crc kubenswrapper[4728]: I0125 06:33:09.970178 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04585afa-3da7-4da9-896a-2acc02ff910e/mysql-bootstrap/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.144261 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_56aa160e-7328-465e-8908-a78bb2fc8364/nova-metadata-metadata/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.172590 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04585afa-3da7-4da9-896a-2acc02ff910e/mysql-bootstrap/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.193473 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04585afa-3da7-4da9-896a-2acc02ff910e/galera/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.196883 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_190d5aab-6cb5-4373-8e88-74ff4f94ca0e/openstackclient/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.371041 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-6x4kp_193b75ba-c337-4422-88ce-aace97ac7638/ovn-controller/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.383205 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n9vwb_37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5/openstack-network-exporter/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.525518 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovsdb-server-init/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.680583 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovsdb-server-init/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.695459 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovsdb-server/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.709124 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovs-vswitchd/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.717601 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-tbq4s_81f0bf26-935c-4d92-aa59-8c2e22b87f2f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.881859 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe5324ca-4693-4d57-84e1-b2facac597bc/openstack-network-exporter/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 06:33:10.912194 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_52f6d754-4e79-4f05-9986-4abde93d34f0/openstack-network-exporter/0.log" Jan 25 06:33:10 crc kubenswrapper[4728]: I0125 
06:33:10.915050 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe5324ca-4693-4d57-84e1-b2facac597bc/ovn-northd/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.026894 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_52f6d754-4e79-4f05-9986-4abde93d34f0/ovsdbserver-nb/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.069152 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ef293374-2620-4494-8bcf-6410e8a53342/openstack-network-exporter/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.087011 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ef293374-2620-4494-8bcf-6410e8a53342/ovsdbserver-sb/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.238590 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f74fbc68-hj87v_e537ee66-7c17-4eb1-a0ce-262f4c260d16/placement-api/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.255401 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_718dab40-f0af-4030-8a9c-2a3a10aa4737/setup-container/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.261528 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f74fbc68-hj87v_e537ee66-7c17-4eb1-a0ce-262f4c260d16/placement-log/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.399874 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_718dab40-f0af-4030-8a9c-2a3a10aa4737/setup-container/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.422568 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_718dab40-f0af-4030-8a9c-2a3a10aa4737/rabbitmq/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.462771 4728 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4a9b861c-f271-4b2b-865e-925bf405c7d1/setup-container/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.589410 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4a9b861c-f271-4b2b-865e-925bf405c7d1/setup-container/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.598635 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4a9b861c-f271-4b2b-865e-925bf405c7d1/rabbitmq/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.610812 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl_291a401e-d560-4e70-b979-57f86593a3b3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.743587 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fdxpj_bee454bd-9662-4de3-ad06-204eaa3d2709/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.759516 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5_6c8ec845-8142-4a8d-95de-59cd6d159155/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.846747 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-shzgd_568f0cc6-4228-4797-b3af-aa2e43b30c83/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:11 crc kubenswrapper[4728]: I0125 06:33:11.912432 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kxvkl_631e7d55-5830-4e5c-9ca0-65029b5b30af/ssh-known-hosts-edpm-deployment/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.021882 4728 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9459545f-6l97s_2a98851c-d86d-423f-a11c-a36fc78633a8/proxy-server/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.039657 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9459545f-6l97s_2a98851c-d86d-423f-a11c-a36fc78633a8/proxy-httpd/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.085286 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-575nj_747ed3cf-861f-46d7-8411-3c3318fbff34/swift-ring-rebalance/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.206126 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-auditor/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.228661 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-reaper/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.282209 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-replicator/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.310808 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-server/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.363618 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-auditor/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.385420 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-replicator/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.399914 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-server/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.463437 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-updater/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.474082 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-auditor/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.545198 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-replicator/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.557699 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-expirer/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.565658 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-server/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.619892 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-updater/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.624546 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/rsync/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.691084 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/swift-recon-cron/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.765121 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bw6st_46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.801237 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_29c48d20-6804-4826-89f8-2b3e00949942/tempest-tests-tempest-tests-runner/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.934565 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a9a9ae9b-52ea-4cc4-b645-f74946df2a17/test-operator-logs-container/0.log" Jan 25 06:33:12 crc kubenswrapper[4728]: I0125 06:33:12.968796 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf_11ed206c-89ec-40be-ad2e-6217031ce033/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:33:14 crc kubenswrapper[4728]: I0125 06:33:14.328872 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:33:14 crc kubenswrapper[4728]: E0125 06:33:14.329249 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:33:27 crc kubenswrapper[4728]: I0125 06:33:27.330145 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:33:27 crc kubenswrapper[4728]: E0125 06:33:27.331461 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:33:30 crc kubenswrapper[4728]: I0125 06:33:30.866209 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/util/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.018615 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/util/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.036303 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/pull/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.061009 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/pull/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.211823 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/pull/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.217449 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/util/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.241649 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/extract/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.387289 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7f86f8796f-ztxz6_c906591a-0a65-447e-a795-aa7fb38c64bb/manager/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.444192 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7478f7dbf9-qgwgl_f171accd-8da2-4cf6-a195-536365fbeceb/manager/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.549169 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-9gflx_13aad2b9-2318-480d-990b-e0627fa9b671/manager/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.682145 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-f2dft_dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6/manager/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.700128 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-pk7js_788a3e7e-9822-4c83-a7b8-0673f1dcbf6d/manager/0.log" Jan 25 06:33:31 crc kubenswrapper[4728]: I0125 06:33:31.799699 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-nkvhs_3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.022699 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-598f7747c9-p2f2m_694de73b-9b23-4f4c-a54d-bdd806df4e20/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.106430 4728 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-694cf4f878-5dxmw_ad255789-2727-45f9-a389-fee59b5a141a/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.172633 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-tzts2_700f794a-9dd3-4cea-bdb4-0f17e7faa246/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.276791 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-ck9lm_5ac48f0b-9ef8-427e-b07c-2318e909b080/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.369635 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-62h6z_733ecefc-22d8-4a52-9540-09b4aac018e1/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.481435 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78d58447c5-b65wc_9db3c1e7-c92f-40d3-8ff9-ef86e1376688/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.599366 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7bdb645866-8kptt_5f8b10f8-34e5-4250-ade1-7d47b008a4d6/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.670550 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4cd88d46-982pl_663cb342-06a8-4ee4-8e1b-6b2658e1781f/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 06:33:32.763471 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-848957f4b4kndnb_3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b/manager/0.log" Jan 25 06:33:32 crc kubenswrapper[4728]: I0125 
06:33:32.952183 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-f6799c556-rtg62_2154b4ed-e610-40ed-8f77-cff0cf57d3a7/operator/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.124390 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wtmq9_f0c09b75-dc3c-4fa8-b310-d95a41ba1564/registry-server/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.256207 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-h4hvm_e0975e48-db18-44dc-99d7-524b381ad58d/manager/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.392968 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-mknzf_8e366d4f-b864-47e2-a289-19f97f76a38a/manager/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.646423 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dsqrk_cf80aae2-133f-475d-900a-13e8f1dec9ea/operator/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.715161 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-8hkv9_e13158ce-126d-4980-9fbd-e7ed492ee879/manager/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.893860 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65d46cfd44-wsx7f_585498aa-6031-43a2-ab1a-f52d1bef52e7/manager/0.log" Jan 25 06:33:33 crc kubenswrapper[4728]: I0125 06:33:33.937218 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85cd9769bb-r6mh4_94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4/manager/0.log" Jan 25 06:33:34 crc kubenswrapper[4728]: I0125 
06:33:34.030128 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-jhgs5_4e555f00-c133-4b06-b5df-005238b0541d/manager/0.log" Jan 25 06:33:34 crc kubenswrapper[4728]: I0125 06:33:34.106422 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-5lhdq_76e4e202-a355-4666-8e84-96486d73174c/manager/0.log" Jan 25 06:33:39 crc kubenswrapper[4728]: I0125 06:33:39.338735 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:33:39 crc kubenswrapper[4728]: E0125 06:33:39.343676 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:33:49 crc kubenswrapper[4728]: I0125 06:33:49.994015 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pb5ln_096f04e7-5491-45f6-9290-0a5bd7b7df49/control-plane-machine-set-operator/0.log" Jan 25 06:33:50 crc kubenswrapper[4728]: I0125 06:33:50.157437 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cz94k_652ad7d4-fb59-48ec-936b-305fa0b0966e/kube-rbac-proxy/0.log" Jan 25 06:33:50 crc kubenswrapper[4728]: I0125 06:33:50.182696 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cz94k_652ad7d4-fb59-48ec-936b-305fa0b0966e/machine-api-operator/0.log" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.128702 4728 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-operators-kqdd8"] Jan 25 06:33:51 crc kubenswrapper[4728]: E0125 06:33:51.129045 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414a1bca-8137-470d-9a7f-5cd95f7d7f8c" containerName="container-00" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.129058 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="414a1bca-8137-470d-9a7f-5cd95f7d7f8c" containerName="container-00" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.129255 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="414a1bca-8137-470d-9a7f-5cd95f7d7f8c" containerName="container-00" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.134879 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.146248 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqdd8"] Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.230379 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-utilities\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.230824 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxngc\" (UniqueName: \"kubernetes.io/projected/0b8133ed-9012-4518-bad4-74d305601b3d-kube-api-access-cxngc\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.230863 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-catalog-content\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.333022 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxngc\" (UniqueName: \"kubernetes.io/projected/0b8133ed-9012-4518-bad4-74d305601b3d-kube-api-access-cxngc\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.333062 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-catalog-content\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.333212 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-utilities\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.333631 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-utilities\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.333973 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-catalog-content\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.352264 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxngc\" (UniqueName: \"kubernetes.io/projected/0b8133ed-9012-4518-bad4-74d305601b3d-kube-api-access-cxngc\") pod \"redhat-operators-kqdd8\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.457427 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:33:51 crc kubenswrapper[4728]: I0125 06:33:51.888602 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqdd8"] Jan 25 06:33:52 crc kubenswrapper[4728]: I0125 06:33:52.649805 4728 generic.go:334] "Generic (PLEG): container finished" podID="0b8133ed-9012-4518-bad4-74d305601b3d" containerID="c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b" exitCode=0 Jan 25 06:33:52 crc kubenswrapper[4728]: I0125 06:33:52.649874 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerDied","Data":"c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b"} Jan 25 06:33:52 crc kubenswrapper[4728]: I0125 06:33:52.650139 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerStarted","Data":"827cfd9269e3b87e074c052f3f562b5413b454e318c003450269c6c547b8d0fa"} Jan 25 06:33:53 crc kubenswrapper[4728]: I0125 06:33:53.329109 4728 scope.go:117] "RemoveContainer" 
containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:33:53 crc kubenswrapper[4728]: E0125 06:33:53.330056 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:33:53 crc kubenswrapper[4728]: I0125 06:33:53.661293 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerStarted","Data":"c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a"} Jan 25 06:33:54 crc kubenswrapper[4728]: I0125 06:33:54.673866 4728 generic.go:334] "Generic (PLEG): container finished" podID="0b8133ed-9012-4518-bad4-74d305601b3d" containerID="c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a" exitCode=0 Jan 25 06:33:54 crc kubenswrapper[4728]: I0125 06:33:54.673939 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerDied","Data":"c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a"} Jan 25 06:33:55 crc kubenswrapper[4728]: I0125 06:33:55.688625 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerStarted","Data":"632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7"} Jan 25 06:33:55 crc kubenswrapper[4728]: I0125 06:33:55.705293 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kqdd8" 
podStartSLOduration=2.217735359 podStartE2EDuration="4.705268106s" podCreationTimestamp="2026-01-25 06:33:51 +0000 UTC" firstStartedPulling="2026-01-25 06:33:52.651162116 +0000 UTC m=+3323.687040096" lastFinishedPulling="2026-01-25 06:33:55.138694862 +0000 UTC m=+3326.174572843" observedRunningTime="2026-01-25 06:33:55.702397535 +0000 UTC m=+3326.738275515" watchObservedRunningTime="2026-01-25 06:33:55.705268106 +0000 UTC m=+3326.741146087" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.386992 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-c5nfr_8060ef0d-4977-4b40-a26c-bded7ccbe72e/cert-manager-controller/0.log" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.441892 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-98wvg_c89a03e4-cc67-408c-93f8-7c0972ac36a8/cert-manager-cainjector/0.log" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.458437 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.459645 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.495452 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.546342 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-mm5zm_e7eaed33-a3a8-45fd-b1be-9bec59f65967/cert-manager-webhook/0.log" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.777691 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:34:01 crc kubenswrapper[4728]: I0125 06:34:01.827138 4728 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqdd8"] Jan 25 06:34:03 crc kubenswrapper[4728]: I0125 06:34:03.756917 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kqdd8" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="registry-server" containerID="cri-o://632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7" gracePeriod=2 Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.163137 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.320830 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-utilities\") pod \"0b8133ed-9012-4518-bad4-74d305601b3d\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.321086 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-catalog-content\") pod \"0b8133ed-9012-4518-bad4-74d305601b3d\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.321396 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxngc\" (UniqueName: \"kubernetes.io/projected/0b8133ed-9012-4518-bad4-74d305601b3d-kube-api-access-cxngc\") pod \"0b8133ed-9012-4518-bad4-74d305601b3d\" (UID: \"0b8133ed-9012-4518-bad4-74d305601b3d\") " Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.321556 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-utilities" (OuterVolumeSpecName: "utilities") pod 
"0b8133ed-9012-4518-bad4-74d305601b3d" (UID: "0b8133ed-9012-4518-bad4-74d305601b3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.322424 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.328583 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8133ed-9012-4518-bad4-74d305601b3d-kube-api-access-cxngc" (OuterVolumeSpecName: "kube-api-access-cxngc") pod "0b8133ed-9012-4518-bad4-74d305601b3d" (UID: "0b8133ed-9012-4518-bad4-74d305601b3d"). InnerVolumeSpecName "kube-api-access-cxngc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.413268 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b8133ed-9012-4518-bad4-74d305601b3d" (UID: "0b8133ed-9012-4518-bad4-74d305601b3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.424893 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxngc\" (UniqueName: \"kubernetes.io/projected/0b8133ed-9012-4518-bad4-74d305601b3d-kube-api-access-cxngc\") on node \"crc\" DevicePath \"\"" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.424923 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8133ed-9012-4518-bad4-74d305601b3d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.768937 4728 generic.go:334] "Generic (PLEG): container finished" podID="0b8133ed-9012-4518-bad4-74d305601b3d" containerID="632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7" exitCode=0 Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.769006 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqdd8" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.769052 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerDied","Data":"632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7"} Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.769389 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqdd8" event={"ID":"0b8133ed-9012-4518-bad4-74d305601b3d","Type":"ContainerDied","Data":"827cfd9269e3b87e074c052f3f562b5413b454e318c003450269c6c547b8d0fa"} Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.769421 4728 scope.go:117] "RemoveContainer" containerID="632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.790094 4728 scope.go:117] "RemoveContainer" 
containerID="c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.805007 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqdd8"] Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.812406 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kqdd8"] Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.828912 4728 scope.go:117] "RemoveContainer" containerID="c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.847502 4728 scope.go:117] "RemoveContainer" containerID="632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7" Jan 25 06:34:04 crc kubenswrapper[4728]: E0125 06:34:04.847859 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7\": container with ID starting with 632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7 not found: ID does not exist" containerID="632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.847917 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7"} err="failed to get container status \"632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7\": rpc error: code = NotFound desc = could not find container \"632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7\": container with ID starting with 632a8bc6611c228e9ee6c832228a13d9e6a6c73ca3da63bfc5213c4b87299cc7 not found: ID does not exist" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.847950 4728 scope.go:117] "RemoveContainer" 
containerID="c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a" Jan 25 06:34:04 crc kubenswrapper[4728]: E0125 06:34:04.848296 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a\": container with ID starting with c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a not found: ID does not exist" containerID="c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.848337 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a"} err="failed to get container status \"c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a\": rpc error: code = NotFound desc = could not find container \"c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a\": container with ID starting with c8dfbdc6357f091482d0044567716e2f20f6147d8801cf983be6867dac48001a not found: ID does not exist" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.848359 4728 scope.go:117] "RemoveContainer" containerID="c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b" Jan 25 06:34:04 crc kubenswrapper[4728]: E0125 06:34:04.848597 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b\": container with ID starting with c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b not found: ID does not exist" containerID="c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b" Jan 25 06:34:04 crc kubenswrapper[4728]: I0125 06:34:04.848621 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b"} err="failed to get container status \"c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b\": rpc error: code = NotFound desc = could not find container \"c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b\": container with ID starting with c89e8b4c68bbccf8ce9449645b3c075a1c1b9b2740f30eabaf0952c5ad9a780b not found: ID does not exist" Jan 25 06:34:05 crc kubenswrapper[4728]: I0125 06:34:05.338166 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" path="/var/lib/kubelet/pods/0b8133ed-9012-4518-bad4-74d305601b3d/volumes" Jan 25 06:34:07 crc kubenswrapper[4728]: I0125 06:34:07.329247 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:34:07 crc kubenswrapper[4728]: E0125 06:34:07.329886 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.046544 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-97nrd"] Jan 25 06:34:12 crc kubenswrapper[4728]: E0125 06:34:12.047398 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="extract-utilities" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.047412 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="extract-utilities" Jan 25 06:34:12 crc kubenswrapper[4728]: 
E0125 06:34:12.047428 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="registry-server" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.047434 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="registry-server" Jan 25 06:34:12 crc kubenswrapper[4728]: E0125 06:34:12.047443 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="extract-content" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.047448 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="extract-content" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.047624 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8133ed-9012-4518-bad4-74d305601b3d" containerName="registry-server" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.048789 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.058122 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nrd"] Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.124190 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-2s8cq_a5ea342d-9a20-4776-80b3-0132cefb2983/nmstate-console-plugin/0.log" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.208670 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-catalog-content\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.208721 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7nb\" (UniqueName: \"kubernetes.io/projected/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-kube-api-access-lq7nb\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.208810 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-utilities\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.311223 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-catalog-content\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.311281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7nb\" (UniqueName: \"kubernetes.io/projected/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-kube-api-access-lq7nb\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.311391 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-utilities\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.311788 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-catalog-content\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.311833 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-utilities\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.336213 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7nb\" (UniqueName: 
\"kubernetes.io/projected/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-kube-api-access-lq7nb\") pod \"redhat-marketplace-97nrd\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.341835 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6qpdd_417875b7-d358-4db4-ad01-1e31c98e4955/nmstate-handler/0.log" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.361668 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-zzzcv_90dfd7eb-b907-4cc3-95c5-69d9cb694372/kube-rbac-proxy/0.log" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.377698 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.506470 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-zzzcv_90dfd7eb-b907-4cc3-95c5-69d9cb694372/nmstate-metrics/0.log" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.574435 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-cr8sf_8a8de132-1d94-4947-bc4e-0968643f10e0/nmstate-operator/0.log" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.698648 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-528j4_67c7adf0-d43f-47b5-8997-4d691eee4e4f/nmstate-webhook/0.log" Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.809971 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nrd"] Jan 25 06:34:12 crc kubenswrapper[4728]: I0125 06:34:12.850418 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" 
event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerStarted","Data":"0f491b6f1954034593b9fc0bf3967e669a19b2e3f8fce7c54bf2cfe7f2860c85"} Jan 25 06:34:13 crc kubenswrapper[4728]: I0125 06:34:13.859819 4728 generic.go:334] "Generic (PLEG): container finished" podID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerID="d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839" exitCode=0 Jan 25 06:34:13 crc kubenswrapper[4728]: I0125 06:34:13.859916 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerDied","Data":"d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839"} Jan 25 06:34:14 crc kubenswrapper[4728]: I0125 06:34:14.869387 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerStarted","Data":"fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed"} Jan 25 06:34:15 crc kubenswrapper[4728]: I0125 06:34:15.888833 4728 generic.go:334] "Generic (PLEG): container finished" podID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerID="fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed" exitCode=0 Jan 25 06:34:15 crc kubenswrapper[4728]: I0125 06:34:15.889104 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerDied","Data":"fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed"} Jan 25 06:34:16 crc kubenswrapper[4728]: I0125 06:34:16.899217 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerStarted","Data":"64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6"} Jan 25 06:34:16 crc kubenswrapper[4728]: I0125 
06:34:16.915347 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-97nrd" podStartSLOduration=2.419100664 podStartE2EDuration="4.915311772s" podCreationTimestamp="2026-01-25 06:34:12 +0000 UTC" firstStartedPulling="2026-01-25 06:34:13.861710031 +0000 UTC m=+3344.897588012" lastFinishedPulling="2026-01-25 06:34:16.35792114 +0000 UTC m=+3347.393799120" observedRunningTime="2026-01-25 06:34:16.911267956 +0000 UTC m=+3347.947145937" watchObservedRunningTime="2026-01-25 06:34:16.915311772 +0000 UTC m=+3347.951189751" Jan 25 06:34:21 crc kubenswrapper[4728]: I0125 06:34:21.329383 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:34:21 crc kubenswrapper[4728]: E0125 06:34:21.330264 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:34:22 crc kubenswrapper[4728]: I0125 06:34:22.378486 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:22 crc kubenswrapper[4728]: I0125 06:34:22.378827 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:22 crc kubenswrapper[4728]: I0125 06:34:22.417923 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:22 crc kubenswrapper[4728]: I0125 06:34:22.990634 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:23 crc kubenswrapper[4728]: I0125 06:34:23.037871 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nrd"] Jan 25 06:34:24 crc kubenswrapper[4728]: I0125 06:34:24.962844 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-97nrd" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="registry-server" containerID="cri-o://64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6" gracePeriod=2 Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.362892 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.490561 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-utilities\") pod \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.491144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-catalog-content\") pod \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.491306 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq7nb\" (UniqueName: \"kubernetes.io/projected/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-kube-api-access-lq7nb\") pod \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\" (UID: \"71e36fa8-a765-4da9-b96d-8909ff1dfdc0\") " Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.491372 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-utilities" (OuterVolumeSpecName: "utilities") pod "71e36fa8-a765-4da9-b96d-8909ff1dfdc0" (UID: "71e36fa8-a765-4da9-b96d-8909ff1dfdc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.491765 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.505005 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-kube-api-access-lq7nb" (OuterVolumeSpecName: "kube-api-access-lq7nb") pod "71e36fa8-a765-4da9-b96d-8909ff1dfdc0" (UID: "71e36fa8-a765-4da9-b96d-8909ff1dfdc0"). InnerVolumeSpecName "kube-api-access-lq7nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.507534 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71e36fa8-a765-4da9-b96d-8909ff1dfdc0" (UID: "71e36fa8-a765-4da9-b96d-8909ff1dfdc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.595397 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.595426 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq7nb\" (UniqueName: \"kubernetes.io/projected/71e36fa8-a765-4da9-b96d-8909ff1dfdc0-kube-api-access-lq7nb\") on node \"crc\" DevicePath \"\"" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.978432 4728 generic.go:334] "Generic (PLEG): container finished" podID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerID="64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6" exitCode=0 Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.978508 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerDied","Data":"64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6"} Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.978528 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97nrd" Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.978560 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97nrd" event={"ID":"71e36fa8-a765-4da9-b96d-8909ff1dfdc0","Type":"ContainerDied","Data":"0f491b6f1954034593b9fc0bf3967e669a19b2e3f8fce7c54bf2cfe7f2860c85"} Jan 25 06:34:25 crc kubenswrapper[4728]: I0125 06:34:25.978589 4728 scope.go:117] "RemoveContainer" containerID="64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.001793 4728 scope.go:117] "RemoveContainer" containerID="fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.008343 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nrd"] Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.019715 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-97nrd"] Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.028765 4728 scope.go:117] "RemoveContainer" containerID="d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.050890 4728 scope.go:117] "RemoveContainer" containerID="64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6" Jan 25 06:34:26 crc kubenswrapper[4728]: E0125 06:34:26.051615 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6\": container with ID starting with 64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6 not found: ID does not exist" containerID="64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.051647 4728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6"} err="failed to get container status \"64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6\": rpc error: code = NotFound desc = could not find container \"64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6\": container with ID starting with 64954c50d1671352efc971a34726ff8aa6d8da5c2fa90e046b0e03fa3c8a4bd6 not found: ID does not exist" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.051671 4728 scope.go:117] "RemoveContainer" containerID="fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed" Jan 25 06:34:26 crc kubenswrapper[4728]: E0125 06:34:26.051886 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed\": container with ID starting with fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed not found: ID does not exist" containerID="fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.051911 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed"} err="failed to get container status \"fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed\": rpc error: code = NotFound desc = could not find container \"fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed\": container with ID starting with fc205ec7fe85f84b6d8c8a9285a954009aec57abb7e451578ad312cbcb1f20ed not found: ID does not exist" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.051927 4728 scope.go:117] "RemoveContainer" containerID="d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839" Jan 25 06:34:26 crc kubenswrapper[4728]: E0125 
06:34:26.052099 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839\": container with ID starting with d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839 not found: ID does not exist" containerID="d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839" Jan 25 06:34:26 crc kubenswrapper[4728]: I0125 06:34:26.052118 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839"} err="failed to get container status \"d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839\": rpc error: code = NotFound desc = could not find container \"d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839\": container with ID starting with d67cf1a266fc0123308aa6cc358bfebf4f12a8a52ba1e7954b8f3729f018a839 not found: ID does not exist" Jan 25 06:34:27 crc kubenswrapper[4728]: I0125 06:34:27.337484 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" path="/var/lib/kubelet/pods/71e36fa8-a765-4da9-b96d-8909ff1dfdc0/volumes" Jan 25 06:34:32 crc kubenswrapper[4728]: I0125 06:34:32.329995 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:34:32 crc kubenswrapper[4728]: E0125 06:34:32.331213 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.573526 
4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-fdm97_d9fe7f50-6608-4a79-81f9-bdf8290d9d90/kube-rbac-proxy/0.log" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.643445 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-fdm97_d9fe7f50-6608-4a79-81f9-bdf8290d9d90/controller/0.log" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.756771 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.916870 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.917978 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.950856 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:34:36 crc kubenswrapper[4728]: I0125 06:34:36.954806 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.206647 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.243535 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.263820 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.293418 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.406064 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.406752 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.441978 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.447209 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/controller/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.578536 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/frr-metrics/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.622509 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/kube-rbac-proxy-frr/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.625420 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/kube-rbac-proxy/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.771599 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/reloader/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.876564 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-9r8h8_a77d36cb-ae0e-41c8-98be-85563d52e02c/frr-k8s-webhook-server/0.log" Jan 25 06:34:37 crc kubenswrapper[4728]: I0125 06:34:37.999101 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b95c97db5-42zxg_3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef/manager/0.log" Jan 25 06:34:38 crc kubenswrapper[4728]: I0125 06:34:38.114183 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6bb64544-ttk7k_0acd04dc-1416-47ec-97a0-f999c55e5efb/webhook-server/0.log" Jan 25 06:34:38 crc kubenswrapper[4728]: I0125 06:34:38.282330 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zrgnp_44533cf6-d8b9-4376-8aad-372d74dbeecd/kube-rbac-proxy/0.log" Jan 25 06:34:38 crc kubenswrapper[4728]: I0125 06:34:38.697038 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zrgnp_44533cf6-d8b9-4376-8aad-372d74dbeecd/speaker/0.log" Jan 25 06:34:38 crc kubenswrapper[4728]: I0125 06:34:38.721216 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/frr/0.log" Jan 25 06:34:44 crc kubenswrapper[4728]: I0125 06:34:44.328487 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:34:44 crc kubenswrapper[4728]: E0125 06:34:44.329101 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.315174 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/util/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.484834 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/pull/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.511391 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/pull/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.519090 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/util/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.620936 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/util/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.657178 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/pull/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.660594 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/extract/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.777072 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/util/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.912017 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/util/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.920606 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/pull/0.log" Jan 25 06:34:49 crc kubenswrapper[4728]: I0125 06:34:49.927357 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/pull/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.060854 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/pull/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.071539 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/util/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.073677 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/extract/0.log" Jan 
25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.204123 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-utilities/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.385466 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-utilities/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.398397 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-content/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.404221 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-content/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.542002 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-utilities/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.545725 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-content/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.783861 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-utilities/0.log" Jan 25 06:34:50 crc kubenswrapper[4728]: I0125 06:34:50.968818 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-content/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.010296 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-content/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.034824 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-utilities/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.112394 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/registry-server/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.214590 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-utilities/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.248892 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-content/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.444594 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z726j_a549470e-be48-449d-b3e8-0caa23a23ee5/marketplace-operator/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.471160 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/registry-server/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.548200 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-utilities/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.727103 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-content/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.739299 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-content/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.752244 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-utilities/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.871940 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-utilities/0.log" Jan 25 06:34:51 crc kubenswrapper[4728]: I0125 06:34:51.897880 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-content/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.054750 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/registry-server/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.087610 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-utilities/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.199139 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-content/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.216419 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-utilities/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.217494 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-content/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.408035 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-content/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.418600 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-utilities/0.log" Jan 25 06:34:52 crc kubenswrapper[4728]: I0125 06:34:52.784178 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/registry-server/0.log" Jan 25 06:34:56 crc kubenswrapper[4728]: I0125 06:34:56.329726 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:34:56 crc kubenswrapper[4728]: E0125 06:34:56.330601 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:35:09 crc kubenswrapper[4728]: I0125 06:35:09.334525 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:35:09 crc kubenswrapper[4728]: E0125 06:35:09.335515 4728 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:35:20 crc kubenswrapper[4728]: I0125 06:35:20.329816 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:35:20 crc kubenswrapper[4728]: E0125 06:35:20.331147 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:35:33 crc kubenswrapper[4728]: I0125 06:35:33.328698 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:35:33 crc kubenswrapper[4728]: E0125 06:35:33.329588 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:35:44 crc kubenswrapper[4728]: I0125 06:35:44.329288 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c" Jan 25 06:35:44 crc kubenswrapper[4728]: I0125 06:35:44.592733 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"2c6cf23d9a194e48566ef43c8f953376fac69e4557ba6f445f007017a1a37521"} Jan 25 06:36:15 crc kubenswrapper[4728]: I0125 06:36:15.882017 4728 generic.go:334] "Generic (PLEG): container finished" podID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerID="c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05" exitCode=0 Jan 25 06:36:15 crc kubenswrapper[4728]: I0125 06:36:15.882132 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2mzgl/must-gather-xhx78" event={"ID":"167b706d-7d8c-46ae-a5ad-14c70b7f6948","Type":"ContainerDied","Data":"c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05"} Jan 25 06:36:15 crc kubenswrapper[4728]: I0125 06:36:15.883235 4728 scope.go:117] "RemoveContainer" containerID="c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05" Jan 25 06:36:16 crc kubenswrapper[4728]: I0125 06:36:16.666068 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2mzgl_must-gather-xhx78_167b706d-7d8c-46ae-a5ad-14c70b7f6948/gather/0.log" Jan 25 06:36:23 crc kubenswrapper[4728]: I0125 06:36:23.879816 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2mzgl/must-gather-xhx78"] Jan 25 06:36:23 crc kubenswrapper[4728]: I0125 06:36:23.880787 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2mzgl/must-gather-xhx78" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="copy" containerID="cri-o://5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213" gracePeriod=2 Jan 25 06:36:23 crc kubenswrapper[4728]: I0125 06:36:23.891896 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2mzgl/must-gather-xhx78"] Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 
06:36:24.238793 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2mzgl_must-gather-xhx78_167b706d-7d8c-46ae-a5ad-14c70b7f6948/copy/0.log" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.239771 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.410720 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/167b706d-7d8c-46ae-a5ad-14c70b7f6948-must-gather-output\") pod \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.410935 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhwgr\" (UniqueName: \"kubernetes.io/projected/167b706d-7d8c-46ae-a5ad-14c70b7f6948-kube-api-access-nhwgr\") pod \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\" (UID: \"167b706d-7d8c-46ae-a5ad-14c70b7f6948\") " Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.428777 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/167b706d-7d8c-46ae-a5ad-14c70b7f6948-kube-api-access-nhwgr" (OuterVolumeSpecName: "kube-api-access-nhwgr") pod "167b706d-7d8c-46ae-a5ad-14c70b7f6948" (UID: "167b706d-7d8c-46ae-a5ad-14c70b7f6948"). InnerVolumeSpecName "kube-api-access-nhwgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.514648 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhwgr\" (UniqueName: \"kubernetes.io/projected/167b706d-7d8c-46ae-a5ad-14c70b7f6948-kube-api-access-nhwgr\") on node \"crc\" DevicePath \"\"" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.555832 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/167b706d-7d8c-46ae-a5ad-14c70b7f6948-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "167b706d-7d8c-46ae-a5ad-14c70b7f6948" (UID: "167b706d-7d8c-46ae-a5ad-14c70b7f6948"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.617631 4728 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/167b706d-7d8c-46ae-a5ad-14c70b7f6948-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.967077 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2mzgl_must-gather-xhx78_167b706d-7d8c-46ae-a5ad-14c70b7f6948/copy/0.log" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.967826 4728 generic.go:334] "Generic (PLEG): container finished" podID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerID="5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213" exitCode=143 Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.967906 4728 scope.go:117] "RemoveContainer" containerID="5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.967942 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2mzgl/must-gather-xhx78" Jan 25 06:36:24 crc kubenswrapper[4728]: I0125 06:36:24.993613 4728 scope.go:117] "RemoveContainer" containerID="c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05" Jan 25 06:36:25 crc kubenswrapper[4728]: I0125 06:36:25.042642 4728 scope.go:117] "RemoveContainer" containerID="5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213" Jan 25 06:36:25 crc kubenswrapper[4728]: E0125 06:36:25.043095 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213\": container with ID starting with 5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213 not found: ID does not exist" containerID="5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213" Jan 25 06:36:25 crc kubenswrapper[4728]: I0125 06:36:25.043136 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213"} err="failed to get container status \"5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213\": rpc error: code = NotFound desc = could not find container \"5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213\": container with ID starting with 5db628ebbd0d0866af0b7ecf9c8a29ceb35ad8cf1a4ed9421e68303550836213 not found: ID does not exist" Jan 25 06:36:25 crc kubenswrapper[4728]: I0125 06:36:25.043161 4728 scope.go:117] "RemoveContainer" containerID="c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05" Jan 25 06:36:25 crc kubenswrapper[4728]: E0125 06:36:25.043635 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05\": container with ID starting with 
c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05 not found: ID does not exist" containerID="c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05" Jan 25 06:36:25 crc kubenswrapper[4728]: I0125 06:36:25.043665 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05"} err="failed to get container status \"c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05\": rpc error: code = NotFound desc = could not find container \"c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05\": container with ID starting with c561b27ed1240ddc27a8a21808508b712088863d1dad15dd1933be03f500de05 not found: ID does not exist" Jan 25 06:36:25 crc kubenswrapper[4728]: I0125 06:36:25.337497 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" path="/var/lib/kubelet/pods/167b706d-7d8c-46ae-a5ad-14c70b7f6948/volumes" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.169820 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdvhb"] Jan 25 06:38:02 crc kubenswrapper[4728]: E0125 06:38:02.170875 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="registry-server" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.170891 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="registry-server" Jan 25 06:38:02 crc kubenswrapper[4728]: E0125 06:38:02.170914 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="extract-utilities" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.170921 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="extract-utilities" Jan 25 
06:38:02 crc kubenswrapper[4728]: E0125 06:38:02.170940 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="copy" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.170946 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="copy" Jan 25 06:38:02 crc kubenswrapper[4728]: E0125 06:38:02.170967 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="extract-content" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.170972 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="extract-content" Jan 25 06:38:02 crc kubenswrapper[4728]: E0125 06:38:02.170984 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="gather" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.170990 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="gather" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.171227 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="gather" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.171242 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e36fa8-a765-4da9-b96d-8909ff1dfdc0" containerName="registry-server" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.171258 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="167b706d-7d8c-46ae-a5ad-14c70b7f6948" containerName="copy" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.172624 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.181126 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdvhb"] Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.317506 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmbrs\" (UniqueName: \"kubernetes.io/projected/0fff5ce9-b339-464b-ba31-b7ac2d358685-kube-api-access-wmbrs\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.317565 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-catalog-content\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.317607 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-utilities\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.420265 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmbrs\" (UniqueName: \"kubernetes.io/projected/0fff5ce9-b339-464b-ba31-b7ac2d358685-kube-api-access-wmbrs\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.420368 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-catalog-content\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.420432 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-utilities\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.421071 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-utilities\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.421064 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-catalog-content\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.450038 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmbrs\" (UniqueName: \"kubernetes.io/projected/0fff5ce9-b339-464b-ba31-b7ac2d358685-kube-api-access-wmbrs\") pod \"certified-operators-hdvhb\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.494067 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:02 crc kubenswrapper[4728]: I0125 06:38:02.924373 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdvhb"] Jan 25 06:38:03 crc kubenswrapper[4728]: I0125 06:38:03.944556 4728 generic.go:334] "Generic (PLEG): container finished" podID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerID="c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22" exitCode=0 Jan 25 06:38:03 crc kubenswrapper[4728]: I0125 06:38:03.944663 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerDied","Data":"c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22"} Jan 25 06:38:03 crc kubenswrapper[4728]: I0125 06:38:03.944910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerStarted","Data":"80e74403294357e8f79e44241f1954d17e61b4ed830f8f07d68b1f0969b7c9cd"} Jan 25 06:38:03 crc kubenswrapper[4728]: I0125 06:38:03.947665 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 25 06:38:04 crc kubenswrapper[4728]: I0125 06:38:04.953816 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerStarted","Data":"e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c"} Jan 25 06:38:05 crc kubenswrapper[4728]: I0125 06:38:05.965680 4728 generic.go:334] "Generic (PLEG): container finished" podID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerID="e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c" exitCode=0 Jan 25 06:38:05 crc kubenswrapper[4728]: I0125 06:38:05.965808 4728 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerDied","Data":"e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c"} Jan 25 06:38:06 crc kubenswrapper[4728]: I0125 06:38:06.988701 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerStarted","Data":"4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f"} Jan 25 06:38:07 crc kubenswrapper[4728]: I0125 06:38:07.015411 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdvhb" podStartSLOduration=2.434978861 podStartE2EDuration="5.01539555s" podCreationTimestamp="2026-01-25 06:38:02 +0000 UTC" firstStartedPulling="2026-01-25 06:38:03.947354574 +0000 UTC m=+3574.983232545" lastFinishedPulling="2026-01-25 06:38:06.527771254 +0000 UTC m=+3577.563649234" observedRunningTime="2026-01-25 06:38:07.005889497 +0000 UTC m=+3578.041767476" watchObservedRunningTime="2026-01-25 06:38:07.01539555 +0000 UTC m=+3578.051273519" Jan 25 06:38:12 crc kubenswrapper[4728]: I0125 06:38:12.494988 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:12 crc kubenswrapper[4728]: I0125 06:38:12.495416 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:12 crc kubenswrapper[4728]: I0125 06:38:12.535468 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:12 crc kubenswrapper[4728]: I0125 06:38:12.899282 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:38:12 crc kubenswrapper[4728]: I0125 06:38:12.899420 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:38:13 crc kubenswrapper[4728]: I0125 06:38:13.085120 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:13 crc kubenswrapper[4728]: I0125 06:38:13.140609 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdvhb"] Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.067261 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdvhb" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="registry-server" containerID="cri-o://4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f" gracePeriod=2 Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.446384 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.586526 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmbrs\" (UniqueName: \"kubernetes.io/projected/0fff5ce9-b339-464b-ba31-b7ac2d358685-kube-api-access-wmbrs\") pod \"0fff5ce9-b339-464b-ba31-b7ac2d358685\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.586805 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-catalog-content\") pod \"0fff5ce9-b339-464b-ba31-b7ac2d358685\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.587018 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-utilities\") pod \"0fff5ce9-b339-464b-ba31-b7ac2d358685\" (UID: \"0fff5ce9-b339-464b-ba31-b7ac2d358685\") " Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.587733 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-utilities" (OuterVolumeSpecName: "utilities") pod "0fff5ce9-b339-464b-ba31-b7ac2d358685" (UID: "0fff5ce9-b339-464b-ba31-b7ac2d358685"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.588192 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-utilities\") on node \"crc\" DevicePath \"\"" Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.594459 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fff5ce9-b339-464b-ba31-b7ac2d358685-kube-api-access-wmbrs" (OuterVolumeSpecName: "kube-api-access-wmbrs") pod "0fff5ce9-b339-464b-ba31-b7ac2d358685" (UID: "0fff5ce9-b339-464b-ba31-b7ac2d358685"). InnerVolumeSpecName "kube-api-access-wmbrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.624594 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fff5ce9-b339-464b-ba31-b7ac2d358685" (UID: "0fff5ce9-b339-464b-ba31-b7ac2d358685"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.689767 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmbrs\" (UniqueName: \"kubernetes.io/projected/0fff5ce9-b339-464b-ba31-b7ac2d358685-kube-api-access-wmbrs\") on node \"crc\" DevicePath \"\"" Jan 25 06:38:15 crc kubenswrapper[4728]: I0125 06:38:15.689801 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fff5ce9-b339-464b-ba31-b7ac2d358685-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.080400 4728 generic.go:334] "Generic (PLEG): container finished" podID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerID="4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f" exitCode=0 Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.080487 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdvhb" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.080501 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerDied","Data":"4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f"} Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.080603 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdvhb" event={"ID":"0fff5ce9-b339-464b-ba31-b7ac2d358685","Type":"ContainerDied","Data":"80e74403294357e8f79e44241f1954d17e61b4ed830f8f07d68b1f0969b7c9cd"} Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.080632 4728 scope.go:117] "RemoveContainer" containerID="4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.098485 4728 scope.go:117] "RemoveContainer" 
containerID="e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.117285 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdvhb"] Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.127490 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdvhb"] Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.127519 4728 scope.go:117] "RemoveContainer" containerID="c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.157551 4728 scope.go:117] "RemoveContainer" containerID="4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f" Jan 25 06:38:16 crc kubenswrapper[4728]: E0125 06:38:16.157969 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f\": container with ID starting with 4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f not found: ID does not exist" containerID="4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.158029 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f"} err="failed to get container status \"4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f\": rpc error: code = NotFound desc = could not find container \"4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f\": container with ID starting with 4ae5941e9d3582f44d64509a08fa1827ceabd53d169febc1b148e6104958fc2f not found: ID does not exist" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.158077 4728 scope.go:117] "RemoveContainer" 
containerID="e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c" Jan 25 06:38:16 crc kubenswrapper[4728]: E0125 06:38:16.158583 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c\": container with ID starting with e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c not found: ID does not exist" containerID="e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.158664 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c"} err="failed to get container status \"e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c\": rpc error: code = NotFound desc = could not find container \"e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c\": container with ID starting with e8cd789449b777af5fcb759700a2f4557775ab4fa4e3e51a9508259043369a5c not found: ID does not exist" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.158999 4728 scope.go:117] "RemoveContainer" containerID="c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22" Jan 25 06:38:16 crc kubenswrapper[4728]: E0125 06:38:16.159539 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22\": container with ID starting with c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22 not found: ID does not exist" containerID="c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22" Jan 25 06:38:16 crc kubenswrapper[4728]: I0125 06:38:16.159589 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22"} err="failed to get container status \"c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22\": rpc error: code = NotFound desc = could not find container \"c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22\": container with ID starting with c63d252c4ef301c5c767c4594bd34cc34f72eae8232173bb9dd0b00f1d2e8e22 not found: ID does not exist" Jan 25 06:38:17 crc kubenswrapper[4728]: I0125 06:38:17.338581 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" path="/var/lib/kubelet/pods/0fff5ce9-b339-464b-ba31-b7ac2d358685/volumes" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.564660 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-89s2s/must-gather-b4vcn"] Jan 25 06:38:37 crc kubenswrapper[4728]: E0125 06:38:37.565573 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="extract-utilities" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.565589 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="extract-utilities" Jan 25 06:38:37 crc kubenswrapper[4728]: E0125 06:38:37.565613 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="extract-content" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.565619 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="extract-content" Jan 25 06:38:37 crc kubenswrapper[4728]: E0125 06:38:37.565647 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="registry-server" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.565655 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="registry-server" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.565855 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fff5ce9-b339-464b-ba31-b7ac2d358685" containerName="registry-server" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.566723 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.569232 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-89s2s"/"openshift-service-ca.crt" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.569459 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-89s2s"/"kube-root-ca.crt" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.582237 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-89s2s/must-gather-b4vcn"] Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.662658 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqrl7\" (UniqueName: \"kubernetes.io/projected/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-kube-api-access-hqrl7\") pod \"must-gather-b4vcn\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.662811 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-must-gather-output\") pod \"must-gather-b4vcn\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.764787 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-must-gather-output\") pod \"must-gather-b4vcn\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.764893 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqrl7\" (UniqueName: \"kubernetes.io/projected/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-kube-api-access-hqrl7\") pod \"must-gather-b4vcn\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.765265 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-must-gather-output\") pod \"must-gather-b4vcn\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.784094 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqrl7\" (UniqueName: \"kubernetes.io/projected/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-kube-api-access-hqrl7\") pod \"must-gather-b4vcn\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:37 crc kubenswrapper[4728]: I0125 06:38:37.917981 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:38:38 crc kubenswrapper[4728]: I0125 06:38:38.325236 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-89s2s/must-gather-b4vcn"] Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.230192 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cf954"] Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.233085 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.244430 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cf954"] Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.284422 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/must-gather-b4vcn" event={"ID":"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2","Type":"ContainerStarted","Data":"1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1"} Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.284481 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/must-gather-b4vcn" event={"ID":"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2","Type":"ContainerStarted","Data":"dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6"} Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.284501 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/must-gather-b4vcn" event={"ID":"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2","Type":"ContainerStarted","Data":"65b3c6052a54bcaebd990211985bb0cc9d0024b19fec8f2568cbbf06511b98c6"} Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.414891 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-utilities\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.414956 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njsn8\" (UniqueName: \"kubernetes.io/projected/5ff7b2b7-456e-4afb-8c39-295a562d0c70-kube-api-access-njsn8\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.415754 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-catalog-content\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.518379 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-utilities\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.518463 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njsn8\" (UniqueName: \"kubernetes.io/projected/5ff7b2b7-456e-4afb-8c39-295a562d0c70-kube-api-access-njsn8\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.518525 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-catalog-content\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.519446 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-catalog-content\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.519947 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-utilities\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.549183 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njsn8\" (UniqueName: \"kubernetes.io/projected/5ff7b2b7-456e-4afb-8c39-295a562d0c70-kube-api-access-njsn8\") pod \"community-operators-cf954\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") " pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:39 crc kubenswrapper[4728]: I0125 06:38:39.562716 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cf954" Jan 25 06:38:40 crc kubenswrapper[4728]: I0125 06:38:40.050984 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-89s2s/must-gather-b4vcn" podStartSLOduration=3.050934953 podStartE2EDuration="3.050934953s" podCreationTimestamp="2026-01-25 06:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 06:38:39.302804241 +0000 UTC m=+3610.338682221" watchObservedRunningTime="2026-01-25 06:38:40.050934953 +0000 UTC m=+3611.086812933" Jan 25 06:38:40 crc kubenswrapper[4728]: I0125 06:38:40.056559 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cf954"] Jan 25 06:38:40 crc kubenswrapper[4728]: I0125 06:38:40.293650 4728 generic.go:334] "Generic (PLEG): container finished" podID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerID="af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4" exitCode=0 Jan 25 06:38:40 crc kubenswrapper[4728]: I0125 06:38:40.293844 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerDied","Data":"af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4"} Jan 25 06:38:40 crc kubenswrapper[4728]: I0125 06:38:40.295147 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerStarted","Data":"13155d1ef0618f00753f93bf4c998dacba8dede630b7957edbd28a4b763b79ee"} Jan 25 06:38:41 crc kubenswrapper[4728]: I0125 06:38:41.308621 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" 
event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerStarted","Data":"82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a"} Jan 25 06:38:41 crc kubenswrapper[4728]: I0125 06:38:41.840887 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-89s2s/crc-debug-cdk54"] Jan 25 06:38:41 crc kubenswrapper[4728]: I0125 06:38:41.842494 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:41 crc kubenswrapper[4728]: I0125 06:38:41.844225 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-89s2s"/"default-dockercfg-854c7" Jan 25 06:38:41 crc kubenswrapper[4728]: I0125 06:38:41.969884 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-host\") pod \"crc-debug-cdk54\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") " pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:41 crc kubenswrapper[4728]: I0125 06:38:41.969933 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nn4x\" (UniqueName: \"kubernetes.io/projected/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-kube-api-access-5nn4x\") pod \"crc-debug-cdk54\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") " pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.071421 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-host\") pod \"crc-debug-cdk54\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") " pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.071513 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-host\") pod \"crc-debug-cdk54\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") " pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.071551 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn4x\" (UniqueName: \"kubernetes.io/projected/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-kube-api-access-5nn4x\") pod \"crc-debug-cdk54\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") " pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.091516 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn4x\" (UniqueName: \"kubernetes.io/projected/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-kube-api-access-5nn4x\") pod \"crc-debug-cdk54\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") " pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.157569 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-cdk54" Jan 25 06:38:42 crc kubenswrapper[4728]: W0125 06:38:42.181693 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e8d1e2b_2ad0_458e_b887_ced12e8d4d41.slice/crio-84f972d6f00453f8eac5da7534ac1fadfd38c84fe106835bb8ea6400832b5c54 WatchSource:0}: Error finding container 84f972d6f00453f8eac5da7534ac1fadfd38c84fe106835bb8ea6400832b5c54: Status 404 returned error can't find the container with id 84f972d6f00453f8eac5da7534ac1fadfd38c84fe106835bb8ea6400832b5c54 Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.319444 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/crc-debug-cdk54" event={"ID":"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41","Type":"ContainerStarted","Data":"84f972d6f00453f8eac5da7534ac1fadfd38c84fe106835bb8ea6400832b5c54"} Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.321985 4728 generic.go:334] "Generic (PLEG): container finished" podID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerID="82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a" exitCode=0 Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.322029 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerDied","Data":"82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a"} Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.900444 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:38:42 crc kubenswrapper[4728]: I0125 06:38:42.900873 4728 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:38:43 crc kubenswrapper[4728]: I0125 06:38:43.339405 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/crc-debug-cdk54" event={"ID":"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41","Type":"ContainerStarted","Data":"17c31edd7d9928a7fbabbf5a593bb8ab6feb91a4582c77162c04f678f0df1a52"} Jan 25 06:38:43 crc kubenswrapper[4728]: I0125 06:38:43.340936 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerStarted","Data":"d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2"} Jan 25 06:38:43 crc kubenswrapper[4728]: I0125 06:38:43.352654 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-89s2s/crc-debug-cdk54" podStartSLOduration=2.352638398 podStartE2EDuration="2.352638398s" podCreationTimestamp="2026-01-25 06:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-25 06:38:43.350303136 +0000 UTC m=+3614.386181115" watchObservedRunningTime="2026-01-25 06:38:43.352638398 +0000 UTC m=+3614.388516378" Jan 25 06:38:43 crc kubenswrapper[4728]: I0125 06:38:43.369966 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cf954" podStartSLOduration=1.868585814 podStartE2EDuration="4.36994471s" podCreationTimestamp="2026-01-25 06:38:39 +0000 UTC" firstStartedPulling="2026-01-25 06:38:40.29693701 +0000 UTC m=+3611.332814989" lastFinishedPulling="2026-01-25 06:38:42.798295905 +0000 UTC m=+3613.834173885" observedRunningTime="2026-01-25 
06:38:43.366402812 +0000 UTC m=+3614.402280791" watchObservedRunningTime="2026-01-25 06:38:43.36994471 +0000 UTC m=+3614.405822690"
Jan 25 06:38:49 crc kubenswrapper[4728]: I0125 06:38:49.563708 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cf954"
Jan 25 06:38:49 crc kubenswrapper[4728]: I0125 06:38:49.564338 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cf954"
Jan 25 06:38:49 crc kubenswrapper[4728]: I0125 06:38:49.608406 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cf954"
Jan 25 06:38:50 crc kubenswrapper[4728]: I0125 06:38:50.442783 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cf954"
Jan 25 06:38:50 crc kubenswrapper[4728]: I0125 06:38:50.484380 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cf954"]
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.409536 4728 generic.go:334] "Generic (PLEG): container finished" podID="1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" containerID="17c31edd7d9928a7fbabbf5a593bb8ab6feb91a4582c77162c04f678f0df1a52" exitCode=0
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.409603 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/crc-debug-cdk54" event={"ID":"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41","Type":"ContainerDied","Data":"17c31edd7d9928a7fbabbf5a593bb8ab6feb91a4582c77162c04f678f0df1a52"}
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.410069 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cf954" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="registry-server" containerID="cri-o://d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2" gracePeriod=2
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.780666 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cf954"
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.853738 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-utilities\") pod \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") "
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.854003 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njsn8\" (UniqueName: \"kubernetes.io/projected/5ff7b2b7-456e-4afb-8c39-295a562d0c70-kube-api-access-njsn8\") pod \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") "
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.854072 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-catalog-content\") pod \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\" (UID: \"5ff7b2b7-456e-4afb-8c39-295a562d0c70\") "
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.854365 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-utilities" (OuterVolumeSpecName: "utilities") pod "5ff7b2b7-456e-4afb-8c39-295a562d0c70" (UID: "5ff7b2b7-456e-4afb-8c39-295a562d0c70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.854821 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-utilities\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.858701 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff7b2b7-456e-4afb-8c39-295a562d0c70-kube-api-access-njsn8" (OuterVolumeSpecName: "kube-api-access-njsn8") pod "5ff7b2b7-456e-4afb-8c39-295a562d0c70" (UID: "5ff7b2b7-456e-4afb-8c39-295a562d0c70"). InnerVolumeSpecName "kube-api-access-njsn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.894996 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ff7b2b7-456e-4afb-8c39-295a562d0c70" (UID: "5ff7b2b7-456e-4afb-8c39-295a562d0c70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.956758 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njsn8\" (UniqueName: \"kubernetes.io/projected/5ff7b2b7-456e-4afb-8c39-295a562d0c70-kube-api-access-njsn8\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:52 crc kubenswrapper[4728]: I0125 06:38:52.956949 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff7b2b7-456e-4afb-8c39-295a562d0c70-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.422746 4728 generic.go:334] "Generic (PLEG): container finished" podID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerID="d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2" exitCode=0
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.422934 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cf954"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.422963 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerDied","Data":"d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2"}
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.423340 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf954" event={"ID":"5ff7b2b7-456e-4afb-8c39-295a562d0c70","Type":"ContainerDied","Data":"13155d1ef0618f00753f93bf4c998dacba8dede630b7957edbd28a4b763b79ee"}
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.423386 4728 scope.go:117] "RemoveContainer" containerID="d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.480407 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-cdk54"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.488905 4728 scope.go:117] "RemoveContainer" containerID="82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.504414 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cf954"]
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.536393 4728 scope.go:117] "RemoveContainer" containerID="af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.536512 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cf954"]
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.562178 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-89s2s/crc-debug-cdk54"]
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.579866 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nn4x\" (UniqueName: \"kubernetes.io/projected/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-kube-api-access-5nn4x\") pod \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") "
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.580659 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-89s2s/crc-debug-cdk54"]
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.580820 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-host\") pod \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\" (UID: \"1e8d1e2b-2ad0-458e-b887-ced12e8d4d41\") "
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.581874 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-host" (OuterVolumeSpecName: "host") pod "1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" (UID: "1e8d1e2b-2ad0-458e-b887-ced12e8d4d41"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.583635 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-host\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.609503 4728 scope.go:117] "RemoveContainer" containerID="d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2"
Jan 25 06:38:53 crc kubenswrapper[4728]: E0125 06:38:53.610354 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2\": container with ID starting with d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2 not found: ID does not exist" containerID="d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.610405 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2"} err="failed to get container status \"d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2\": rpc error: code = NotFound desc = could not find container \"d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2\": container with ID starting with d861b556446a5ad6f6bf9b7f05f30f0c1c4868db8abb09834cb18d72c19e8fe2 not found: ID does not exist"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.610437 4728 scope.go:117] "RemoveContainer" containerID="82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a"
Jan 25 06:38:53 crc kubenswrapper[4728]: E0125 06:38:53.610809 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a\": container with ID starting with 82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a not found: ID does not exist" containerID="82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.610854 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a"} err="failed to get container status \"82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a\": rpc error: code = NotFound desc = could not find container \"82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a\": container with ID starting with 82bd17d54ca766b6a96cbb5524dad4cbccdd4f890a817c88a050dda3ebe0e36a not found: ID does not exist"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.610886 4728 scope.go:117] "RemoveContainer" containerID="af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4"
Jan 25 06:38:53 crc kubenswrapper[4728]: E0125 06:38:53.611208 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4\": container with ID starting with af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4 not found: ID does not exist" containerID="af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.611232 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4"} err="failed to get container status \"af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4\": rpc error: code = NotFound desc = could not find container \"af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4\": container with ID starting with af50fca96395061803f9519f498cff2827d8f6163fd5d94bc3336ad243c3cba4 not found: ID does not exist"
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.612670 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-kube-api-access-5nn4x" (OuterVolumeSpecName: "kube-api-access-5nn4x") pod "1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" (UID: "1e8d1e2b-2ad0-458e-b887-ced12e8d4d41"). InnerVolumeSpecName "kube-api-access-5nn4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 06:38:53 crc kubenswrapper[4728]: I0125 06:38:53.686583 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nn4x\" (UniqueName: \"kubernetes.io/projected/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41-kube-api-access-5nn4x\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.437262 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84f972d6f00453f8eac5da7534ac1fadfd38c84fe106835bb8ea6400832b5c54"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.437462 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-cdk54"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.656552 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-89s2s/crc-debug-mt2j8"]
Jan 25 06:38:54 crc kubenswrapper[4728]: E0125 06:38:54.656983 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="extract-content"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.657001 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="extract-content"
Jan 25 06:38:54 crc kubenswrapper[4728]: E0125 06:38:54.657021 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="extract-utilities"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.657027 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="extract-utilities"
Jan 25 06:38:54 crc kubenswrapper[4728]: E0125 06:38:54.657048 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="registry-server"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.657065 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="registry-server"
Jan 25 06:38:54 crc kubenswrapper[4728]: E0125 06:38:54.657086 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" containerName="container-00"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.657093 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" containerName="container-00"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.657311 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" containerName="registry-server"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.657344 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" containerName="container-00"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.658003 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.659876 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-89s2s"/"default-dockercfg-854c7"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.813608 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-host\") pod \"crc-debug-mt2j8\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") " pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.813909 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jbv\" (UniqueName: \"kubernetes.io/projected/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-kube-api-access-88jbv\") pod \"crc-debug-mt2j8\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") " pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.916719 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jbv\" (UniqueName: \"kubernetes.io/projected/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-kube-api-access-88jbv\") pod \"crc-debug-mt2j8\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") " pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.917165 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-host\") pod \"crc-debug-mt2j8\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") " pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.917299 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-host\") pod \"crc-debug-mt2j8\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") " pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.934908 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jbv\" (UniqueName: \"kubernetes.io/projected/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-kube-api-access-88jbv\") pod \"crc-debug-mt2j8\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") " pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:54 crc kubenswrapper[4728]: I0125 06:38:54.974807 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:55 crc kubenswrapper[4728]: W0125 06:38:55.004950 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb88cdb1b_50b6_4be8_9bd0_c11d2a431960.slice/crio-b5d2d03670c2bbe073e59e7747d6eb74736a93c249c21e9fbdb708b3bd165c4a WatchSource:0}: Error finding container b5d2d03670c2bbe073e59e7747d6eb74736a93c249c21e9fbdb708b3bd165c4a: Status 404 returned error can't find the container with id b5d2d03670c2bbe073e59e7747d6eb74736a93c249c21e9fbdb708b3bd165c4a
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.338686 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8d1e2b-2ad0-458e-b887-ced12e8d4d41" path="/var/lib/kubelet/pods/1e8d1e2b-2ad0-458e-b887-ced12e8d4d41/volumes"
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.339548 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff7b2b7-456e-4afb-8c39-295a562d0c70" path="/var/lib/kubelet/pods/5ff7b2b7-456e-4afb-8c39-295a562d0c70/volumes"
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.446941 4728 generic.go:334] "Generic (PLEG): container finished" podID="b88cdb1b-50b6-4be8-9bd0-c11d2a431960" containerID="4451d0b3428db0f540908d6eec7c876297aab74f3a2936fd9f756457b3fc7979" exitCode=1
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.446988 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/crc-debug-mt2j8" event={"ID":"b88cdb1b-50b6-4be8-9bd0-c11d2a431960","Type":"ContainerDied","Data":"4451d0b3428db0f540908d6eec7c876297aab74f3a2936fd9f756457b3fc7979"}
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.447014 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/crc-debug-mt2j8" event={"ID":"b88cdb1b-50b6-4be8-9bd0-c11d2a431960","Type":"ContainerStarted","Data":"b5d2d03670c2bbe073e59e7747d6eb74736a93c249c21e9fbdb708b3bd165c4a"}
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.483635 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-89s2s/crc-debug-mt2j8"]
Jan 25 06:38:55 crc kubenswrapper[4728]: I0125 06:38:55.491969 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-89s2s/crc-debug-mt2j8"]
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.530098 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.657778 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-host\") pod \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") "
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.657842 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jbv\" (UniqueName: \"kubernetes.io/projected/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-kube-api-access-88jbv\") pod \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\" (UID: \"b88cdb1b-50b6-4be8-9bd0-c11d2a431960\") "
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.657873 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-host" (OuterVolumeSpecName: "host") pod "b88cdb1b-50b6-4be8-9bd0-c11d2a431960" (UID: "b88cdb1b-50b6-4be8-9bd0-c11d2a431960"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.658239 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-host\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.664629 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-kube-api-access-88jbv" (OuterVolumeSpecName: "kube-api-access-88jbv") pod "b88cdb1b-50b6-4be8-9bd0-c11d2a431960" (UID: "b88cdb1b-50b6-4be8-9bd0-c11d2a431960"). InnerVolumeSpecName "kube-api-access-88jbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 06:38:56 crc kubenswrapper[4728]: I0125 06:38:56.760485 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jbv\" (UniqueName: \"kubernetes.io/projected/b88cdb1b-50b6-4be8-9bd0-c11d2a431960-kube-api-access-88jbv\") on node \"crc\" DevicePath \"\""
Jan 25 06:38:57 crc kubenswrapper[4728]: I0125 06:38:57.344859 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88cdb1b-50b6-4be8-9bd0-c11d2a431960" path="/var/lib/kubelet/pods/b88cdb1b-50b6-4be8-9bd0-c11d2a431960/volumes"
Jan 25 06:38:57 crc kubenswrapper[4728]: I0125 06:38:57.467470 4728 scope.go:117] "RemoveContainer" containerID="4451d0b3428db0f540908d6eec7c876297aab74f3a2936fd9f756457b3fc7979"
Jan 25 06:38:57 crc kubenswrapper[4728]: I0125 06:38:57.467517 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/crc-debug-mt2j8"
Jan 25 06:39:12 crc kubenswrapper[4728]: I0125 06:39:12.899458 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 25 06:39:12 crc kubenswrapper[4728]: I0125 06:39:12.900121 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 25 06:39:12 crc kubenswrapper[4728]: I0125 06:39:12.900186 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd"
Jan 25 06:39:12 crc kubenswrapper[4728]: I0125 06:39:12.900787 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c6cf23d9a194e48566ef43c8f953376fac69e4557ba6f445f007017a1a37521"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 25 06:39:12 crc kubenswrapper[4728]: I0125 06:39:12.900833 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://2c6cf23d9a194e48566ef43c8f953376fac69e4557ba6f445f007017a1a37521" gracePeriod=600
Jan 25 06:39:13 crc kubenswrapper[4728]: I0125 06:39:13.620884 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="2c6cf23d9a194e48566ef43c8f953376fac69e4557ba6f445f007017a1a37521" exitCode=0
Jan 25 06:39:13 crc kubenswrapper[4728]: I0125 06:39:13.620954 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"2c6cf23d9a194e48566ef43c8f953376fac69e4557ba6f445f007017a1a37521"}
Jan 25 06:39:13 crc kubenswrapper[4728]: I0125 06:39:13.621694 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerStarted","Data":"d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6"}
Jan 25 06:39:13 crc kubenswrapper[4728]: I0125 06:39:13.621729 4728 scope.go:117] "RemoveContainer" containerID="3ac95280fc95e45abb268a40334764b5c05d32e3a688f4a99c725381b702280c"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.287336 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b8466696b-psmv9_36477670-fd4c-4015-8fab-b7608c72a906/barbican-api/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.396685 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b8466696b-psmv9_36477670-fd4c-4015-8fab-b7608c72a906/barbican-api-log/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.431896 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b587b754-6vhmj_4564c893-fa22-47c0-92b9-4d503b3553ee/barbican-keystone-listener/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.483185 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b587b754-6vhmj_4564c893-fa22-47c0-92b9-4d503b3553ee/barbican-keystone-listener-log/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.575638 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b49f9f9b7-2ld95_fd2e0efe-0434-4103-a08d-d014f69addf6/barbican-worker/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.594601 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b49f9f9b7-2ld95_fd2e0efe-0434-4103-a08d-d014f69addf6/barbican-worker-log/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.733217 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ch7jd_18badfbd-fe91-4d6e-8ecd-765ed6994030/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.792561 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/ceilometer-central-agent/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.872024 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/ceilometer-notification-agent/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.900193 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/proxy-httpd/0.log"
Jan 25 06:39:38 crc kubenswrapper[4728]: I0125 06:39:38.949617 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_05cf94b4-4884-4e05-9036-3676fb8aedcb/sg-core/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.059241 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9e0ee7f-ce38-4e4f-afe0-993551ae84a8/cinder-api-log/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.248121 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e6ab997e-b648-4c08-9ba9-4166b43ebde2/cinder-scheduler/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.274121 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e6ab997e-b648-4c08-9ba9-4166b43ebde2/probe/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.306438 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9e0ee7f-ce38-4e4f-afe0-993551ae84a8/cinder-api/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.426352 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2c4ts_022da454-0c7e-4950-9147-f13a2f725b47/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.516222 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lz5qs_c70ce086-a2f9-4979-b292-aa69dc5f9bc3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.567805 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cf5bfd7f-hhq75_2e967299-2864-48a8-ba27-7d2a63f66c43/init/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.713498 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cf5bfd7f-hhq75_2e967299-2864-48a8-ba27-7d2a63f66c43/init/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.738300 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cf5bfd7f-hhq75_2e967299-2864-48a8-ba27-7d2a63f66c43/dnsmasq-dns/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.757699 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-665xk_abcaa620-a9bf-4edf-a044-ea75ca9fa872/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.893065 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8fbf2f2e-5205-4c3d-8b05-185404930c85/glance-log/0.log"
Jan 25 06:39:39 crc kubenswrapper[4728]: I0125 06:39:39.894556 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8fbf2f2e-5205-4c3d-8b05-185404930c85/glance-httpd/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.023360 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_036ddc84-2b06-4817-9afd-537d8ed82150/glance-log/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.031946 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_036ddc84-2b06-4817-9afd-537d8ed82150/glance-httpd/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.105030 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-49vxd_c7e1052e-86fc-4070-be06-23fef77216d8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.190705 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qx5hk_4601c9b9-a8f1-49b4-ab17-e2d575ce1e2d/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.442184 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29488681-j86cs_6197fee2-5bbd-4edd-bcb5-c10f476f4f83/keystone-cron/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.560048 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-64dbb5f568-n5f5j_eca4ef36-bc3a-42aa-8ab1-6a6cfdbee581/keystone-api/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.576410 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c45b9d32-afe0-490e-876d-64a9359773ff/kube-state-metrics/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.685394 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8hxkd_99aea3d5-d496-457b-87b9-95c444db3c76/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.858054 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5162db63-7667-482e-a9bd-174365a318cc/memcached/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.964744 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54985bc57c-7dmw7_0d93e327-c397-427e-abe4-0065144bcb7a/neutron-api/0.log"
Jan 25 06:39:40 crc kubenswrapper[4728]: I0125 06:39:40.997734 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54985bc57c-7dmw7_0d93e327-c397-427e-abe4-0065144bcb7a/neutron-httpd/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.073834 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wqbzb_6f8bee4e-2d23-4efc-81cc-e82bb4466eb4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.677239 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bf4fc010-98c5-4734-a9c9-3de4f1d1a34b/nova-api-log/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.703823 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_34d58c30-a0dd-40da-94b2-ab3cba2038ad/nova-cell0-conductor-conductor/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.711715 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bf4fc010-98c5-4734-a9c9-3de4f1d1a34b/nova-api-api/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.731971 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8ae38c3f-b058-49a8-8df8-5222dc364151/nova-cell1-conductor-conductor/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.894682 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-prsds_6b1b4c44-e390-4a48-aa8a-84f5509ef99e/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.962967 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_99b7a342-4f5c-4977-b189-b0e4cf975704/nova-cell1-novncproxy-novncproxy/0.log"
Jan 25 06:39:41 crc kubenswrapper[4728]: I0125 06:39:41.984718 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_56aa160e-7328-465e-8908-a78bb2fc8364/nova-metadata-log/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.189203 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_847abeb1-1f82-44cc-a876-2b8787688696/mysql-bootstrap/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.241222 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_086fe2f2-83d2-440c-bcde-f3d1bf8f21c8/nova-scheduler-scheduler/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.347889 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_847abeb1-1f82-44cc-a876-2b8787688696/galera/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.360750 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_847abeb1-1f82-44cc-a876-2b8787688696/mysql-bootstrap/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.484845 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04585afa-3da7-4da9-896a-2acc02ff910e/mysql-bootstrap/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.560675 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04585afa-3da7-4da9-896a-2acc02ff910e/mysql-bootstrap/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.591308 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04585afa-3da7-4da9-896a-2acc02ff910e/galera/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.657777 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_190d5aab-6cb5-4373-8e88-74ff4f94ca0e/openstackclient/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.801622 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6x4kp_193b75ba-c337-4422-88ce-aace97ac7638/ovn-controller/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.846603 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_56aa160e-7328-465e-8908-a78bb2fc8364/nova-metadata-metadata/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.871763 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n9vwb_37e45bde-b2c9-4617-a3e2-c0a1a5db3aa5/openstack-network-exporter/0.log"
Jan 25 06:39:42 crc kubenswrapper[4728]: I0125 06:39:42.961302 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovsdb-server-init/0.log"
Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.104020 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovsdb-server/0.log"
Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.107851 4728 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovs-vswitchd/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.109612 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mr9hh_8a1328da-8d1f-4f1e-9f8b-d61559200740/ovsdb-server-init/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.152205 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-tbq4s_81f0bf26-935c-4d92-aa59-8c2e22b87f2f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.282793 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe5324ca-4693-4d57-84e1-b2facac597bc/openstack-network-exporter/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.287199 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe5324ca-4693-4d57-84e1-b2facac597bc/ovn-northd/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.317549 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_52f6d754-4e79-4f05-9986-4abde93d34f0/openstack-network-exporter/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.435396 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_52f6d754-4e79-4f05-9986-4abde93d34f0/ovsdbserver-nb/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.454997 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ef293374-2620-4494-8bcf-6410e8a53342/ovsdbserver-sb/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.458250 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ef293374-2620-4494-8bcf-6410e8a53342/openstack-network-exporter/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.645564 4728 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-7f74fbc68-hj87v_e537ee66-7c17-4eb1-a0ce-262f4c260d16/placement-api/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.665945 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f74fbc68-hj87v_e537ee66-7c17-4eb1-a0ce-262f4c260d16/placement-log/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.678599 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_718dab40-f0af-4030-8a9c-2a3a10aa4737/setup-container/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.809298 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_718dab40-f0af-4030-8a9c-2a3a10aa4737/rabbitmq/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.829897 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_718dab40-f0af-4030-8a9c-2a3a10aa4737/setup-container/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.840428 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4a9b861c-f271-4b2b-865e-925bf405c7d1/setup-container/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.973694 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4a9b861c-f271-4b2b-865e-925bf405c7d1/rabbitmq/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.977964 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4a9b861c-f271-4b2b-865e-925bf405c7d1/setup-container/0.log" Jan 25 06:39:43 crc kubenswrapper[4728]: I0125 06:39:43.993621 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-w9rhl_291a401e-d560-4e70-b979-57f86593a3b3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.111764 4728 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fdxpj_bee454bd-9662-4de3-ad06-204eaa3d2709/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.125283 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-q29s5_6c8ec845-8142-4a8d-95de-59cd6d159155/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.176197 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-shzgd_568f0cc6-4228-4797-b3af-aa2e43b30c83/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.275075 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kxvkl_631e7d55-5830-4e5c-9ca0-65029b5b30af/ssh-known-hosts-edpm-deployment/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.410518 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9459545f-6l97s_2a98851c-d86d-423f-a11c-a36fc78633a8/proxy-server/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.421604 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9459545f-6l97s_2a98851c-d86d-423f-a11c-a36fc78633a8/proxy-httpd/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.460010 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-575nj_747ed3cf-861f-46d7-8411-3c3318fbff34/swift-ring-rebalance/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.565186 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-auditor/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.569150 4728 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-reaper/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.614084 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-replicator/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.642529 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/account-server/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.716120 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-auditor/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.743701 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-server/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.778470 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-replicator/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.783641 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/container-updater/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.810497 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-auditor/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.883018 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-expirer/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.932190 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-replicator/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.936199 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-updater/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.949142 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/object-server/0.log" Jan 25 06:39:44 crc kubenswrapper[4728]: I0125 06:39:44.980996 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/rsync/0.log" Jan 25 06:39:45 crc kubenswrapper[4728]: I0125 06:39:45.037658 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f3c74720-9ea2-42cd-93d6-1c17ede15e62/swift-recon-cron/0.log" Jan 25 06:39:45 crc kubenswrapper[4728]: I0125 06:39:45.121181 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bw6st_46d9d0ef-f0c0-45c4-8497-5cca3ea0ff76/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:39:45 crc kubenswrapper[4728]: I0125 06:39:45.156195 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_29c48d20-6804-4826-89f8-2b3e00949942/tempest-tests-tempest-tests-runner/0.log" Jan 25 06:39:45 crc kubenswrapper[4728]: I0125 06:39:45.269927 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a9a9ae9b-52ea-4cc4-b645-f74946df2a17/test-operator-logs-container/0.log" Jan 25 06:39:45 crc kubenswrapper[4728]: I0125 06:39:45.380006 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-rvtbf_11ed206c-89ec-40be-ad2e-6217031ce033/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 25 06:40:02 crc kubenswrapper[4728]: I0125 06:40:02.812677 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/util/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.107817 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/util/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.117796 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/pull/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.152455 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/pull/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.273019 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/util/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.309583 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/pull/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.320495 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_35c37904d3a17d1a14bd1b3c4a14453aa908ae663076c617ee6579657fsvwqr_4946a9cb-495f-442c-b77d-9ff84ce2b795/extract/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.513098 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7f86f8796f-ztxz6_c906591a-0a65-447e-a795-aa7fb38c64bb/manager/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.524725 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7478f7dbf9-qgwgl_f171accd-8da2-4cf6-a195-536365fbeceb/manager/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.606663 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-9gflx_13aad2b9-2318-480d-990b-e0627fa9b671/manager/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.775505 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-f2dft_dde2fdf6-bca8-4c8d-ab78-4d7bd95785d6/manager/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.780526 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-pk7js_788a3e7e-9822-4c83-a7b8-0673f1dcbf6d/manager/0.log" Jan 25 06:40:03 crc kubenswrapper[4728]: I0125 06:40:03.864986 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-nkvhs_3fafd5fa-0f36-40b6-9fb1-f83c8799d8c6/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.047395 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-598f7747c9-p2f2m_694de73b-9b23-4f4c-a54d-bdd806df4e20/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.162841 4728 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-694cf4f878-5dxmw_ad255789-2727-45f9-a389-fee59b5a141a/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.211529 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-tzts2_700f794a-9dd3-4cea-bdb4-0f17e7faa246/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.338790 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-ck9lm_5ac48f0b-9ef8-427e-b07c-2318e909b080/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.397255 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-62h6z_733ecefc-22d8-4a52-9540-09b4aac018e1/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.550910 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78d58447c5-b65wc_9db3c1e7-c92f-40d3-8ff9-ef86e1376688/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.611706 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7bdb645866-8kptt_5f8b10f8-34e5-4250-ade1-7d47b008a4d6/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.780051 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-848957f4b4kndnb_3ae373b2-dc3c-4c6d-b2bb-69a15bc1d52b/manager/0.log" Jan 25 06:40:04 crc kubenswrapper[4728]: I0125 06:40:04.794844 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4cd88d46-982pl_663cb342-06a8-4ee4-8e1b-6b2658e1781f/manager/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 
06:40:05.105923 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-f6799c556-rtg62_2154b4ed-e610-40ed-8f77-cff0cf57d3a7/operator/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.176536 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wtmq9_f0c09b75-dc3c-4fa8-b310-d95a41ba1564/registry-server/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.328957 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-h4hvm_e0975e48-db18-44dc-99d7-524b381ad58d/manager/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.480565 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-mknzf_8e366d4f-b864-47e2-a289-19f97f76a38a/manager/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.632770 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dsqrk_cf80aae2-133f-475d-900a-13e8f1dec9ea/operator/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.812253 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-8hkv9_e13158ce-126d-4980-9fbd-e7ed492ee879/manager/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.927599 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65d46cfd44-wsx7f_585498aa-6031-43a2-ab1a-f52d1bef52e7/manager/0.log" Jan 25 06:40:05 crc kubenswrapper[4728]: I0125 06:40:05.953182 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85cd9769bb-r6mh4_94a07441-42a8-4cc4-bbe9-f6cc6b5e8ac4/manager/0.log" Jan 25 06:40:06 crc kubenswrapper[4728]: I0125 
06:40:06.030703 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-jhgs5_4e555f00-c133-4b06-b5df-005238b0541d/manager/0.log" Jan 25 06:40:06 crc kubenswrapper[4728]: I0125 06:40:06.114292 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-5lhdq_76e4e202-a355-4666-8e84-96486d73174c/manager/0.log" Jan 25 06:40:23 crc kubenswrapper[4728]: I0125 06:40:23.511361 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pb5ln_096f04e7-5491-45f6-9290-0a5bd7b7df49/control-plane-machine-set-operator/0.log" Jan 25 06:40:23 crc kubenswrapper[4728]: I0125 06:40:23.641620 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cz94k_652ad7d4-fb59-48ec-936b-305fa0b0966e/kube-rbac-proxy/0.log" Jan 25 06:40:23 crc kubenswrapper[4728]: I0125 06:40:23.670610 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cz94k_652ad7d4-fb59-48ec-936b-305fa0b0966e/machine-api-operator/0.log" Jan 25 06:40:34 crc kubenswrapper[4728]: I0125 06:40:34.123906 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-c5nfr_8060ef0d-4977-4b40-a26c-bded7ccbe72e/cert-manager-controller/0.log" Jan 25 06:40:34 crc kubenswrapper[4728]: I0125 06:40:34.257101 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-98wvg_c89a03e4-cc67-408c-93f8-7c0972ac36a8/cert-manager-cainjector/0.log" Jan 25 06:40:34 crc kubenswrapper[4728]: I0125 06:40:34.303023 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-mm5zm_e7eaed33-a3a8-45fd-b1be-9bec59f65967/cert-manager-webhook/0.log" Jan 25 06:40:43 crc 
kubenswrapper[4728]: I0125 06:40:43.967962 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-2s8cq_a5ea342d-9a20-4776-80b3-0132cefb2983/nmstate-console-plugin/0.log" Jan 25 06:40:44 crc kubenswrapper[4728]: I0125 06:40:44.118477 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6qpdd_417875b7-d358-4db4-ad01-1e31c98e4955/nmstate-handler/0.log" Jan 25 06:40:44 crc kubenswrapper[4728]: I0125 06:40:44.158232 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-zzzcv_90dfd7eb-b907-4cc3-95c5-69d9cb694372/kube-rbac-proxy/0.log" Jan 25 06:40:44 crc kubenswrapper[4728]: I0125 06:40:44.176102 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-zzzcv_90dfd7eb-b907-4cc3-95c5-69d9cb694372/nmstate-metrics/0.log" Jan 25 06:40:44 crc kubenswrapper[4728]: I0125 06:40:44.268788 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-cr8sf_8a8de132-1d94-4947-bc4e-0968643f10e0/nmstate-operator/0.log" Jan 25 06:40:44 crc kubenswrapper[4728]: I0125 06:40:44.348219 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-528j4_67c7adf0-d43f-47b5-8997-4d691eee4e4f/nmstate-webhook/0.log" Jan 25 06:41:05 crc kubenswrapper[4728]: I0125 06:41:05.783989 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-fdm97_d9fe7f50-6608-4a79-81f9-bdf8290d9d90/kube-rbac-proxy/0.log" Jan 25 06:41:05 crc kubenswrapper[4728]: I0125 06:41:05.836949 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-fdm97_d9fe7f50-6608-4a79-81f9-bdf8290d9d90/controller/0.log" Jan 25 06:41:05 crc kubenswrapper[4728]: I0125 06:41:05.958286 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.104890 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.111842 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.133813 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.139370 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.254247 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.279070 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.286012 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.321380 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.470500 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-frr-files/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.474261 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-metrics/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.480920 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/cp-reloader/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.483030 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/controller/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.640627 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/frr-metrics/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.650818 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/kube-rbac-proxy/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.677398 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/kube-rbac-proxy-frr/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.848069 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-9r8h8_a77d36cb-ae0e-41c8-98be-85563d52e02c/frr-k8s-webhook-server/0.log" Jan 25 06:41:06 crc kubenswrapper[4728]: I0125 06:41:06.874774 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/reloader/0.log" Jan 25 06:41:07 crc kubenswrapper[4728]: I0125 06:41:07.114780 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b95c97db5-42zxg_3effc2ca-4fa0-4a2c-9f33-4eb0fe2d32ef/manager/0.log" Jan 25 06:41:07 crc kubenswrapper[4728]: I0125 06:41:07.268898 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6bb64544-ttk7k_0acd04dc-1416-47ec-97a0-f999c55e5efb/webhook-server/0.log" Jan 25 06:41:07 crc kubenswrapper[4728]: I0125 06:41:07.360798 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zrgnp_44533cf6-d8b9-4376-8aad-372d74dbeecd/kube-rbac-proxy/0.log" Jan 25 06:41:07 crc kubenswrapper[4728]: I0125 06:41:07.993857 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zrgnp_44533cf6-d8b9-4376-8aad-372d74dbeecd/speaker/0.log" Jan 25 06:41:08 crc kubenswrapper[4728]: I0125 06:41:08.086014 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-722sp_8730e55e-04b1-4da0-acc2-3f2ca701ba6a/frr/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.477732 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/util/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.696374 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/util/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.709660 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/pull/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.726852 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/pull/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.854223 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/util/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.856373 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/pull/0.log" Jan 25 06:41:18 crc kubenswrapper[4728]: I0125 06:41:18.876047 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcn8p29_ab67c2ed-9f47-4c5e-90bc-45dd5c7aa8d8/extract/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.157293 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/util/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.284629 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/pull/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.295423 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/util/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.307562 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/pull/0.log" Jan 25 
06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.477965 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/extract/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.480768 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/util/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.481640 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713t8mgm_11334404-2639-4444-a499-8312bc233ad6/pull/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.607310 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-utilities/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.760630 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-content/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.773642 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-content/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.779440 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-utilities/0.log" Jan 25 06:41:19 crc kubenswrapper[4728]: I0125 06:41:19.937230 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-utilities/0.log" Jan 25 06:41:19 crc 
kubenswrapper[4728]: I0125 06:41:19.949006 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/extract-content/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.118119 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-utilities/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.374574 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-utilities/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.395639 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-content/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.400866 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-content/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.556816 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7hjgs_0d0a6d26-536c-4931-9aa7-803fe8bb55a3/registry-server/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.564945 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-utilities/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.582014 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/extract-content/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.814126 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z726j_a549470e-be48-449d-b3e8-0caa23a23ee5/marketplace-operator/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.881466 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q7ldz_a10bfcaa-828b-444b-948b-1063ce7b114f/registry-server/0.log" Jan 25 06:41:20 crc kubenswrapper[4728]: I0125 06:41:20.964742 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-utilities/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.064266 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-utilities/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.086020 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-content/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.090534 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-content/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.230897 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-content/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.233189 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/extract-utilities/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.375387 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-pss7z_ffca99f0-7a8a-4ca9-a0c8-9e8a8e1ea77e/registry-server/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.390199 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-utilities/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.603518 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-content/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.603685 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-content/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.619595 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-utilities/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.744498 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-utilities/0.log" Jan 25 06:41:21 crc kubenswrapper[4728]: I0125 06:41:21.750410 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/extract-content/0.log" Jan 25 06:41:22 crc kubenswrapper[4728]: I0125 06:41:22.198621 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khmzg_bad51ca6-3feb-4d91-b168-4330e2698fc1/registry-server/0.log" Jan 25 06:41:42 crc kubenswrapper[4728]: I0125 06:41:42.898675 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:41:42 crc kubenswrapper[4728]: I0125 06:41:42.899232 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:42:12 crc kubenswrapper[4728]: I0125 06:42:12.899108 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:42:12 crc kubenswrapper[4728]: I0125 06:42:12.900004 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 25 06:42:42 crc kubenswrapper[4728]: I0125 06:42:42.899638 4728 patch_prober.go:28] interesting pod/machine-config-daemon-w9dvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 25 06:42:42 crc kubenswrapper[4728]: I0125 06:42:42.900115 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 25 06:42:42 crc kubenswrapper[4728]: I0125 06:42:42.900168 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" Jan 25 06:42:42 crc kubenswrapper[4728]: I0125 06:42:42.900937 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6"} pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 25 06:42:42 crc kubenswrapper[4728]: I0125 06:42:42.900980 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerName="machine-config-daemon" containerID="cri-o://d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" gracePeriod=600 Jan 25 06:42:43 crc kubenswrapper[4728]: E0125 06:42:43.024766 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:42:43 crc kubenswrapper[4728]: I0125 06:42:43.307902 4728 generic.go:334] "Generic (PLEG): container finished" podID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" exitCode=0 Jan 25 06:42:43 crc kubenswrapper[4728]: I0125 06:42:43.307943 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" 
event={"ID":"d10b5a2b-cd5b-4f07-a2a3-06c2c8437002","Type":"ContainerDied","Data":"d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6"} Jan 25 06:42:43 crc kubenswrapper[4728]: I0125 06:42:43.307975 4728 scope.go:117] "RemoveContainer" containerID="2c6cf23d9a194e48566ef43c8f953376fac69e4557ba6f445f007017a1a37521" Jan 25 06:42:43 crc kubenswrapper[4728]: I0125 06:42:43.308495 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:42:43 crc kubenswrapper[4728]: E0125 06:42:43.308871 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:42:46 crc kubenswrapper[4728]: I0125 06:42:46.339822 4728 generic.go:334] "Generic (PLEG): container finished" podID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerID="dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6" exitCode=0 Jan 25 06:42:46 crc kubenswrapper[4728]: I0125 06:42:46.339934 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-89s2s/must-gather-b4vcn" event={"ID":"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2","Type":"ContainerDied","Data":"dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6"} Jan 25 06:42:46 crc kubenswrapper[4728]: I0125 06:42:46.341846 4728 scope.go:117] "RemoveContainer" containerID="dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6" Jan 25 06:42:46 crc kubenswrapper[4728]: I0125 06:42:46.948187 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-89s2s_must-gather-b4vcn_dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2/gather/0.log" Jan 25 06:42:56 crc 
kubenswrapper[4728]: I0125 06:42:56.328548 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:42:56 crc kubenswrapper[4728]: E0125 06:42:56.329241 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:42:56 crc kubenswrapper[4728]: I0125 06:42:56.686103 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-89s2s/must-gather-b4vcn"] Jan 25 06:42:56 crc kubenswrapper[4728]: I0125 06:42:56.686398 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-89s2s/must-gather-b4vcn" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="copy" containerID="cri-o://1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1" gracePeriod=2 Jan 25 06:42:56 crc kubenswrapper[4728]: I0125 06:42:56.691856 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-89s2s/must-gather-b4vcn"] Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.118822 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-89s2s_must-gather-b4vcn_dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2/copy/0.log" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.119448 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.240266 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqrl7\" (UniqueName: \"kubernetes.io/projected/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-kube-api-access-hqrl7\") pod \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.240425 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-must-gather-output\") pod \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\" (UID: \"dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2\") " Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.254868 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-kube-api-access-hqrl7" (OuterVolumeSpecName: "kube-api-access-hqrl7") pod "dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" (UID: "dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2"). InnerVolumeSpecName "kube-api-access-hqrl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.343360 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqrl7\" (UniqueName: \"kubernetes.io/projected/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-kube-api-access-hqrl7\") on node \"crc\" DevicePath \"\"" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.356779 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" (UID: "dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.444990 4728 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.450733 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-89s2s_must-gather-b4vcn_dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2/copy/0.log" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.451291 4728 generic.go:334] "Generic (PLEG): container finished" podID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerID="1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1" exitCode=143 Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.451362 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-89s2s/must-gather-b4vcn" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.451380 4728 scope.go:117] "RemoveContainer" containerID="1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.469892 4728 scope.go:117] "RemoveContainer" containerID="dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.525667 4728 scope.go:117] "RemoveContainer" containerID="1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1" Jan 25 06:42:57 crc kubenswrapper[4728]: E0125 06:42:57.526183 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1\": container with ID starting with 1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1 not found: ID does not exist" 
containerID="1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.526238 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1"} err="failed to get container status \"1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1\": rpc error: code = NotFound desc = could not find container \"1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1\": container with ID starting with 1a299ac6f74375c594d7489eddbe8974333034278560f256e0b33687d93f32b1 not found: ID does not exist" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.526266 4728 scope.go:117] "RemoveContainer" containerID="dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6" Jan 25 06:42:57 crc kubenswrapper[4728]: E0125 06:42:57.526628 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6\": container with ID starting with dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6 not found: ID does not exist" containerID="dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6" Jan 25 06:42:57 crc kubenswrapper[4728]: I0125 06:42:57.526672 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6"} err="failed to get container status \"dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6\": rpc error: code = NotFound desc = could not find container \"dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6\": container with ID starting with dbbbc6e7bf6c4dd50dc79b7eba9a6908569f912d1732f1e230de25dee58942f6 not found: ID does not exist" Jan 25 06:42:59 crc kubenswrapper[4728]: I0125 06:42:59.337638 4728 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" path="/var/lib/kubelet/pods/dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2/volumes" Jan 25 06:43:09 crc kubenswrapper[4728]: I0125 06:43:09.333989 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:43:09 crc kubenswrapper[4728]: E0125 06:43:09.334914 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:43:22 crc kubenswrapper[4728]: I0125 06:43:22.329599 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:43:22 crc kubenswrapper[4728]: E0125 06:43:22.330916 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:43:37 crc kubenswrapper[4728]: I0125 06:43:37.328946 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:43:37 crc kubenswrapper[4728]: E0125 06:43:37.329757 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:43:50 crc kubenswrapper[4728]: I0125 06:43:50.328987 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:43:50 crc kubenswrapper[4728]: E0125 06:43:50.329792 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:44:01 crc kubenswrapper[4728]: I0125 06:44:01.328784 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:44:01 crc kubenswrapper[4728]: E0125 06:44:01.330004 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:44:14 crc kubenswrapper[4728]: I0125 06:44:14.329172 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:44:14 crc kubenswrapper[4728]: E0125 06:44:14.330021 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:44:25 crc kubenswrapper[4728]: I0125 06:44:25.328804 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:44:25 crc kubenswrapper[4728]: E0125 06:44:25.329472 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:44:38 crc kubenswrapper[4728]: I0125 06:44:38.329703 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:44:38 crc kubenswrapper[4728]: E0125 06:44:38.330457 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:44:51 crc kubenswrapper[4728]: I0125 06:44:51.329626 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6" Jan 25 06:44:51 crc kubenswrapper[4728]: E0125 06:44:51.330178 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002" Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.157214 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"] Jan 25 06:45:00 crc kubenswrapper[4728]: E0125 06:45:00.158036 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="gather" Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158050 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="gather" Jan 25 06:45:00 crc kubenswrapper[4728]: E0125 06:45:00.158067 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="copy" Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158072 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="copy" Jan 25 06:45:00 crc kubenswrapper[4728]: E0125 06:45:00.158081 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88cdb1b-50b6-4be8-9bd0-c11d2a431960" containerName="container-00" Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158087 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88cdb1b-50b6-4be8-9bd0-c11d2a431960" containerName="container-00" Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158249 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88cdb1b-50b6-4be8-9bd0-c11d2a431960" containerName="container-00" Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158258 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="copy" 
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158270 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb0b5a0-ca20-4573-ab0d-ade3f74c9cc2" containerName="gather"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.158888 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.160641 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.160675 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.163719 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"]
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.342508 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-secret-volume\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.342545 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-config-volume\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.342965 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-kube-api-access-6hw4l\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.445223 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-kube-api-access-6hw4l\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.445378 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-secret-volume\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.445404 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-config-volume\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.446104 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-config-volume\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.449873 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-secret-volume\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.459463 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-kube-api-access-6hw4l\") pod \"collect-profiles-29488725-sbg59\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.474030 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:00 crc kubenswrapper[4728]: I0125 06:45:00.845892 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"]
Jan 25 06:45:01 crc kubenswrapper[4728]: I0125 06:45:01.336150 4728 generic.go:334] "Generic (PLEG): container finished" podID="14fd5ed8-e89d-4d3b-98b5-57e2da7b7190" containerID="3ee776ef1de14b8a8f29d3d8760f94a3eea4b29360ff00bb09f234494e9d2f8b" exitCode=0
Jan 25 06:45:01 crc kubenswrapper[4728]: I0125 06:45:01.336184 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59" event={"ID":"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190","Type":"ContainerDied","Data":"3ee776ef1de14b8a8f29d3d8760f94a3eea4b29360ff00bb09f234494e9d2f8b"}
Jan 25 06:45:01 crc kubenswrapper[4728]: I0125 06:45:01.336202 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59" event={"ID":"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190","Type":"ContainerStarted","Data":"10b8e51fa4b85dafe5b1363fcdc48370914c3c313c0e762b80e2e974fac39dca"}
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.603628 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.779114 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-secret-volume\") pod \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") "
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.779455 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-kube-api-access-6hw4l\") pod \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") "
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.779511 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-config-volume\") pod \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\" (UID: \"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190\") "
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.780140 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-config-volume" (OuterVolumeSpecName: "config-volume") pod "14fd5ed8-e89d-4d3b-98b5-57e2da7b7190" (UID: "14fd5ed8-e89d-4d3b-98b5-57e2da7b7190"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.784374 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-kube-api-access-6hw4l" (OuterVolumeSpecName: "kube-api-access-6hw4l") pod "14fd5ed8-e89d-4d3b-98b5-57e2da7b7190" (UID: "14fd5ed8-e89d-4d3b-98b5-57e2da7b7190"). InnerVolumeSpecName "kube-api-access-6hw4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.784749 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "14fd5ed8-e89d-4d3b-98b5-57e2da7b7190" (UID: "14fd5ed8-e89d-4d3b-98b5-57e2da7b7190"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.880735 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hw4l\" (UniqueName: \"kubernetes.io/projected/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-kube-api-access-6hw4l\") on node \"crc\" DevicePath \"\""
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.880763 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-config-volume\") on node \"crc\" DevicePath \"\""
Jan 25 06:45:02 crc kubenswrapper[4728]: I0125 06:45:02.880772 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14fd5ed8-e89d-4d3b-98b5-57e2da7b7190-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 25 06:45:03 crc kubenswrapper[4728]: I0125 06:45:03.348219 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59" event={"ID":"14fd5ed8-e89d-4d3b-98b5-57e2da7b7190","Type":"ContainerDied","Data":"10b8e51fa4b85dafe5b1363fcdc48370914c3c313c0e762b80e2e974fac39dca"}
Jan 25 06:45:03 crc kubenswrapper[4728]: I0125 06:45:03.348246 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29488725-sbg59"
Jan 25 06:45:03 crc kubenswrapper[4728]: I0125 06:45:03.348250 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b8e51fa4b85dafe5b1363fcdc48370914c3c313c0e762b80e2e974fac39dca"
Jan 25 06:45:03 crc kubenswrapper[4728]: I0125 06:45:03.655769 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt"]
Jan 25 06:45:03 crc kubenswrapper[4728]: I0125 06:45:03.662996 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29488680-9zlxt"]
Jan 25 06:45:05 crc kubenswrapper[4728]: I0125 06:45:05.328863 4728 scope.go:117] "RemoveContainer" containerID="d9b89c554190dac3f43ba0ec83ce5e14c20e3c640d48b7f4058fff1c40840df6"
Jan 25 06:45:05 crc kubenswrapper[4728]: E0125 06:45:05.329480 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9dvd_openshift-machine-config-operator(d10b5a2b-cd5b-4f07-a2a3-06c2c8437002)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9dvd" podUID="d10b5a2b-cd5b-4f07-a2a3-06c2c8437002"
Jan 25 06:45:05 crc kubenswrapper[4728]: I0125 06:45:05.337510 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45dfb47-684c-467a-ab58-23d5bcd3bc7c" path="/var/lib/kubelet/pods/e45dfb47-684c-467a-ab58-23d5bcd3bc7c/volumes"